github: https://github.com/wendi-white/biol607_homework

#libraries
library(rsample)
library(purrr)
library(dplyr)
## 
## Attaching package: 'dplyr'
## The following objects are masked from 'package:stats':
## 
##     filter, lag
## The following objects are masked from 'package:base':
## 
##     intersect, setdiff, setequal, union
library(ggplot2)
library(tidyr)
library(modelr)
library(rstan)
## Loading required package: StanHeaders
## rstan (Version 2.21.1, GitRev: 2e1f913d3ca3)
## For execution on a local, multicore CPU with excess RAM we recommend calling
## options(mc.cores = parallel::detectCores()).
## To avoid recompilation of unchanged Stan programs, we recommend calling
## rstan_options(auto_write = TRUE)
## 
## Attaching package: 'rstan'
## The following object is masked from 'package:tidyr':
## 
##     extract
library(brms)
## Loading required package: Rcpp
## 
## Attaching package: 'Rcpp'
## The following object is masked from 'package:rsample':
## 
##     populate
## Loading 'brms' package (version 2.14.0). Useful instructions
## can be found by typing help('brms'). A more detailed introduction
## to the package is available through vignette('brms_overview').
## 
## Attaching package: 'brms'
## The following object is masked from 'package:rstan':
## 
##     loo
## The following object is masked from 'package:stats':
## 
##     ar
library(MASS)
## 
## Attaching package: 'MASS'
## The following object is masked from 'package:dplyr':
## 
##     select
library(gganimate)
library(bayesplot)
## This is bayesplot version 1.7.2
## - Online documentation and vignettes at mc-stan.org/bayesplot
## - bayesplot theme set to bayesplot::theme_default()
##    * Does _not_ affect other ggplot2 plots
##    * See ?bayesplot_theme_set for details on theme setting
library(profileModel)
library(AICcmodavg)
library(compare)
## 
## Attaching package: 'compare'
## The following object is masked from 'package:base':
## 
##     isTRUE
library(tidybayes)
## 
## Attaching package: 'tidybayes'
## The following objects are masked from 'package:brms':
## 
##     dstudent_t, pstudent_t, qstudent_t, rstudent_t
library(boot)
  1. Sampling your system (10 points) Each of you has a study system you work in and a question of interest. Give an example of one variable that you would sample in order to get a sense of its variation in nature. Describe, in detail, how you would sample for the population of that variable in order to understand its distribution. Questions to consider include, but are not limited to: Just what is your sample versus your population? What would your sampling design be? Why would you design it that particular way? What are potential confounding influences of both sampling technique and sample design that you need to be careful to avoid? What statistical distribution might the variable take, and why?

One variable I would sample is wrack abundance, in order to capture its variation across time (seasons/years) and elevation. To sample the population I would run transects across high, mid, and low elevations in the marsh, taking data at low tide during peak wrack season (spring) for about one week. I'd want to observe back-to-back low tides, despite lighting issues, so I could mark wrack and understand true maximum wrack replacement. This would give me an idea of wrack replacement and whether it stays consistent over tides (I would also track how high each tide is). After peak season I would sample once every month until winter, when snow starts to accumulate, then start again after the winter months and resume heavy sampling in the spring. The largest impacts of wrack will likely happen during spring, so accurate measurements of wrack abundance then will be essential.

When I go out to sample I would use MARINe protocols for my transects. At the start of the experiment I would set up 10 permanent transect grids, each 100 m long, running back from the creek edges. I can use GPS points or permanent markers (PVC) to set the transects. We'd take a measurement every 0.5 m and record whether there was wrack (classified as fresh vs. old, plus wrack species type) or none.

Answering the questions presented after this brief layout: 1. Just what is your sample versus your population? My sample is the ten transects that start from the creek bank but my population is wrack across the entirety of the marsh to look at its distribution.

  2. What would your sampling design be? Sampling across 10 permanent 100 m transects. Data points tell us whether there is wrack or not. If there is wrack, what species is it? What state is it in (fresh vs. old)?

  3. Why would you design it that particular way? I'd design it this way so I could get an overall idea of the distribution of wrack across the marsh. Using fixed plots will also let us look at wrack replacement during peak season and over different high tides.

  4. What are potential confounding influences of both sampling technique and sample design that you need to be careful to avoid? Nothing, it's perfect :) Just kidding. When sampling I'd need to be sure that what I define as "fresh" vs. "old" is clear. This would take some field observations and a literature dive, as I'm newer to this system. Based on my previous experience, kelp is old when it's dried out and/or browned.

  5. What statistical distribution might the variable take, and why? I would imagine we would see differences in wrack abundance across tides, and almost certainly across seasons due to higher algal production in the spring. I would also expect higher wrack deposits in the high intertidal, because replacement depends on whether the next high tide sweeps the wrack back out or not. The amount of time wrack spends in an area also determines how long it has to decompose and subsidize the local area. Since these are counts with many low values and occasional big deposits, the variable would likely be overdispersed, so something like a negative binomial distribution might fit better than a Poisson.
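To make the distribution question concrete, here is a minimal simulation sketch of what overdispersed wrack counts per sampling point could look like under a negative binomial. The parameters (`mu = 4`, `size = 1.5`) are entirely hypothetical, not from any real wrack data.

```r
#simulate hypothetical wrack counts per 0.5 m sampling point
#mu = 4 (made-up mean) and size = 1.5 (made-up dispersion) are illustrative only
set.seed(607)
wrack_counts <- rnbinom(1000, mu = 4, size = 1.5)

#overdispersion check: for a negative binomial, variance = mu + mu^2/size,
#so the sample variance should sit well above the sample mean (unlike a Poisson)
mean(wrack_counts)
var(wrack_counts)

#what the distribution looks like: many low counts, a long right tail
hist(wrack_counts, breaks = 30,
     main = "Simulated wrack counts (negative binomial)",
     xlab = "Pieces of wrack per sampling point")
```

If the real data had even more zeros than this (many points with no wrack at all), a zero-inflated model would be worth considering.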

  2. Data Reshaping and Visualization. Johns Hopkins has been maintaining one of the best Covid-19 time series data sets out there. The data on the US can be found at "https://github.com/CSSEGISandData/COVID-19/blob/master/csse_covid_19_data/csse_covid_19_time_series/time_series_covid19_confirmed_US.csv" with information about what is in the data at "https://github.com/CSSEGISandData/COVID-19/tree/master/csse_covid_19_data"

2a) Access (5 points) Download and read in the data. Can you do this without downloading, but read directly from the archive (+1)

covid_confirmed <-  read.csv("https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_time_series/time_series_covid19_confirmed_US.csv", check.names = F) #read directly from the GitHub raw URL; check.names = F stops R from prepending an X to the date columns

covid_confirmed<- covid_confirmed[,c(6,7,12:298)] #pull just the admin, state and date columns

2b) It’s big and wide! (10 Points) The data is, well, huge. It’s also wide, with dates as columns. Write a function that, given a state, will output a time series (long data) of cumulative cases in that state as well as new daily cases. Note, let’s make the date column that emerges a true date object. Let’s say you’ve called it date_col. If you mutate it, mutate(date_col = lubridate::mdy(date_col)), it will be turned into a date object that will have a recognized order. {lubridate} is da bomb, and I’m hoping we have some time to cover it in the future. +5 extra credit for merging it with some other data source to also return cases per 100,000 people.
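Before writing the function, a quick look at what `lubridate::mdy()` does to JHU-style date strings (toy values here, not read from the data):

```r
library(lubridate)
#the JHU date columns look like "1/22/20"; mdy() parses them into true Date objects
mdy(c("1/22/20", "11/3/20"))
## [1] "2020-01-22" "2020-11-03"
```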

func_by_state <- function(state){ #when given a state
  #pivot our data longer so that the dates become rows, not columns
  covid_longer <- pivot_longer(covid_confirmed, -c(Province_State, Admin2),
               names_to = "date_col",
               values_to = "cumulative_cases") %>% 
    #turn the date column into a true date object
    mutate(date_col = lubridate::mdy(date_col))
  #filter to the state and collapse counties so the state has one reading per day
  df <- covid_longer %>%
    filter(Province_State == state) %>%
    group_by(date_col) %>%
    summarize(cumulative_cases = sum(cumulative_cases)) %>%
    #daily cases = today's cumulative minus yesterday's (lag by one day);
    #summing by date first keeps the lag from crossing county boundaries
    mutate(new_daily_cases = cumulative_cases - lag(cumulative_cases, n = 1)) %>%
    #drop the NA first day and the occasional negative correction in the data
    filter(new_daily_cases >= 0)
  return(df)
}  
mass_df <- func_by_state("Massachusetts") #test for mass
## `summarise()` ungrouping output (override with `.groups` argument)
alaska_df <- func_by_state("Alaska") #test for alaska
## `summarise()` ungrouping output (override with `.groups` argument)

2c) Let’s get visual! (10 Points) Great! Make a compelling plot of the time series for Massachusetts! Points for style, class, ease of understanding major trends, etc. Note, 10/10 only for the most killer figures. Don’t phone it in! Also, note what the data from JHU is. Do you want the cumulative, or daily, or what?

coeff <- 50 #set coeff in order for secondary axis to be of right proportion 

#call in mass df and filter for only cases >0 (noticed one neg number in data and also the top number in the column is funky b/c it's doing a cum-last cum so produces large neg #)
mass_plot <- ggplot(data=mass_df %>%
                      filter(new_daily_cases > 0)%>%
                      filter(cumulative_cases > 0),
                    mapping = aes(x= date_col)) +
  #x is always date but then two y axis with new daily and cumulative
  geom_line(aes(y=new_daily_cases, colour="new_daily_cases")) +
  geom_line(aes(y=cumulative_cases / coeff, colour="cumulative_cases")) + # divide by coeff so it's proportional to the other y scale
  #the continuous y scale needs the sec axis scale to be set by the coeff
  scale_y_continuous(
    #First axis name
    name = "New daily cases" , 
    # Add a second axis and specify name
    sec.axis = sec_axis(~.*coeff, name="Cumulative cases")) +
  #label whole graph
  labs(title= "Changes in new daily and cumulative cases for Massachusetts",
        subtitle = "Tracking changes from Jan-Nov 2020",
        x= "Time",
        y= "New daily cases")+
  #remove legend title
  theme(legend.title=element_blank())+
      #give lines colors
  scale_color_manual(values=c("#CC6666", "#9999CC"))+
  transition_reveal(date_col) #animate it so it animates by the date
mass_plot
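One design note on the hard-coded `coeff`: it could instead be computed from the data so the two series always peak at the same height. A sketch, assuming `mass_df` from `func_by_state()` above:

```r
#scale the cumulative series so its peak lines up with the daily series' peak
coeff_dynamic <- max(mass_df$cumulative_cases) / max(mass_df$new_daily_cases)
coeff_dynamic #pass this instead of 50 wherever coeff is used
```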

2d) Cool. Now, write a function that will take what you did above, and create a plot for any state - so, I enter Alaska and I get the plot for Alaska! +2 if it can do daily or cumulative cases - or cases per 100,000 if you did that above. +3 EC if you highlight points of interest - but dynamically using the data. Note, you might need to do some funky stuff to make things fit well in the plot for this one. Or, meh.

func_by_state_plot <- function(state, coeff= 50){ #when given a state and a coeff (50 works well for most states)
  #pivot our data longer so that the dates become rows, not columns
  covid_longer <- pivot_longer(covid_confirmed, -c(Province_State, Admin2),
               names_to = "date_col",
               values_to = "cumulative_cases") %>% 
    #turn the date column into a true date object
    mutate(date_col = lubridate::mdy(date_col))
  #filter to the state and collapse counties so the state has one reading per day
  df <- covid_longer %>%
    filter(Province_State == state) %>%
    group_by(date_col) %>%
    summarize(cumulative_cases = sum(cumulative_cases)) %>%
    #daily cases = today's cumulative minus yesterday's (lag by one day);
    #summing by date first keeps the lag from crossing county boundaries
    mutate(new_daily_cases = cumulative_cases - lag(cumulative_cases, n = 1))
    #call in df and filter for only cases >0 (noticed one neg number in data and also the top number in the column is funky b/c it's doing a cum-last cum so produces large neg #)
    plot <- ggplot(data=df %>%
                      filter(new_daily_cases > 0)%>%
                      filter(cumulative_cases > 0),
                   #x is always date but then two y axis with new daily and cumulative
                    mapping = aes(x= date_col))+
  geom_line(aes(y=new_daily_cases, colour="new_daily_cases"))+
  geom_line(aes(y=cumulative_cases / coeff, colour="cumulative_cases"))+ # divide by coeff so it's proportional to the other y scale
  #the continuous y scale needs the sec axis scale to be set by the coeff
  scale_y_continuous(
    #First axis name
    name = "New daily cases",
    # Add a second axis and name it 
    sec.axis = sec_axis(~.*coeff, name="Cumulative cases")) +
  labs(title= "Changes in new daily cases and cumulative cases over time",
        subtitle = state,
        x= "Time",
        y= "New daily cases")+
  #remove legend title
  theme(legend.title=element_blank())+
      #give lines colors
  scale_color_manual(values=c("#CC6666", "#9999CC"))+
  transition_reveal(date_col) #animate it
  return(plot)
}


func_by_state_plot("Alaska") #try with alaska
## `summarise()` ungrouping output (override with `.groups` argument)

  3. Let’s get philosophical. (10 points) We have discussed multiple inferential frameworks this semester. Frequentist NHST, Likelihood and model comparison, Bayesian probabilistic thinking, Assessment of Predictive Ability (which spans frameworks!), and more. We’ve talked about Popper and Lakatos. Put these pieces of the puzzle together and look deep within yourself. What do you feel is the inferential framework that you adopt as a scientist? Why? Include in your answer why you prefer the inferential tools (e.g. confidence intervals, test statistics, out-of-sample prediction, posterior probabilities, etc.) of your chosen worldview and why you do not like the ones of the other one. This includes defining just what those different tools mean, as well as relating them to the things you study. extra credit for citing and discussing outside sources - one point per source/point

Well, to be quite honest, I was a frequentist before starting this class. I relied very heavily on running an analysis, or looking at an analysis in a paper, and saying: oh, the p-value is low, reject the null, cool. In part this was all I had been taught, so it was quite mind boggling to start thinking about statistics in a new way. In the past, all I worried about was what the pregnancy-test meme stands for: avoiding Type I error (rejecting the null even though it's true) and Type II error (failing to reject the null when it's false). 95% confidence intervals were a safe place to start!

This year will really be my first attempt at developing a project with statistics in mind. One of the first times I really thought about statistical theory was upon the introduction of Lakatos and Popper. I identify more strongly with Lakatos b/c of his idea that we need research programs built around a hard-core theory. When I think conceptually about the impacts of subsidies on food webs, there are so many examples that reach different conclusions. Sometimes subsidies are good and sometimes they can be bad. We create a body of literature as a scientific community to define and build upon current standing theories in an attempt to paint a bigger picture.

Moving on to the way I analyze problems, I now think I reside most heavily in likelihood and model comparison. After seeing how our linear models can be used to make the same interpretations as t-tests, it all really came together for me. I had been relying on my (narrow) previous knowledge, and I think I needed the dots to connect before I could really commit to understanding why we run linear models. It's because everything just IS a linear model!! Now I can finally get behind reading the different diagnostic graphs and seeing how well our data behave. And furthering this by looking at how much my data can say about my system is even COOLER! I used to rely on a p-value to tell me how confident I could be in my data and the analysis it produced. Now, being able to look at a ggplot or qqplot and know what to take away from it, I feel confident in this type of analysis.

I will admit that I'm still trying to get a better grasp on utilizing Bayesian statistics. I haven't ever used real data with Bayesian methods, so I'm having a hard time applying them to a question. Once I begin an experiment and can start to think about the impact of a prior on my analysis, I could see my mindset shifting to work more within this framework. When we did the problem with the sun, I could see how taking our priors into account is crucial. I also think that analyzing credible intervals could be highly beneficial for understanding the probability that a parameter takes a given range of values.

Overall, I see this class helping me grow in the way I think about statistics (and read other papers!). I can foresee my mindset changing as I learn more and develop my own research.

  4. Bayes Theorem (10 points) I’ve referenced the following figure a few times. I’d like you to demonstrate your understanding of Bayes Theorem by hand (e.g. calculate it out and show your work - you can do this all in R, I’m not a monster) showing what the probability of the sun exploding is given that the device said yes. Assume that your prior probability that the sun explodes is p(Sun Explodes) = 0.0001 (I’ll leave it to you to get p(Sun Doesn’t Explode)). The rest of the information you need - and some you don’t - is in the cartoon - p(Yes | Explodes), p(Yes | Doesn’t Explode), p(No | Explodes), p(No | Doesn’t Explode).

The frequentist relies entirely on the probability of rolling a 6 twice in a row (1/36, about 2.8%). However, this approach isn't really answering our question, b/c it means we can't use what we already know: that it is highly unlikely the sun will ever explode. That is the difference between frequentist and Bayesian statistics, b/c with Bayes we can take our prior knowledge into account. Instead of treating the 2.8% chance of the machine lying as evidence of an explosion, we get only a 0.3488141% chance the sun exploded given a yes. It's HIGHLY unlikely!

My math: p(sun exploding | detector says yes) = p(detector says yes | sun exploding) * p(sun exploding) / p(detector says yes)

p(e|d) = p(d|e) * p(e) / p(d) = (35/36) * 0.0001 / 0.02787222 = 0.003488141

p(d|e): the likelihood of observing a yes from the machine given that the sun exploded. This is 35/36, b/c the machine only lies to us 1/36 of the time (1/6 x 1/6 = prob of rolling two 6’s)

p(e): the prior prob the sun will explode -- given as 0.0001

p(d): the prob that the detector says yes at all. We need to add the chance the machine is truthful and the sun explodes to the chance the machine lies and the sun doesn’t explode = ((35/36) * 0.0001) + ((1 - (35/36)) * (1 - 0.0001)) = 0.02787222

#calculate p(sun exploding | detector says yes) with Bayes theorem:
#p(e|d) = p(d|e) * p(e) / p(d)

#p(d|e): likelihood of a yes from the machine given that the sun exploded.
#This is 35/36, b/c the machine only lies to us 1/36 of the time
#(1/6 * 1/6 = prob of rolling two 6's)
p_d_given_e <- 35/36

#p(e): prior prob the sun will explode -- given as 0.0001
p_e <- 0.0001

#p(d): prob the detector says yes = (machine truthful AND sun explodes) +
#(machine lies AND sun doesn't explode)
p_d <- (p_d_given_e * p_e) + ((1 - p_d_given_e) * (1 - p_e))
p_d
## [1] 0.02787222

#putting it together
p_e_given_d <- (p_d_given_e * p_e) / p_d
p_e_given_d
## [1] 0.003488141

4a Extra Credit (10 Points) Why is this a bad parody of frequentist statistics?

The frequentist treats the probability of the machine lying as a p-value: b/c that p-value is < 0.05, they conclude the sun has exploded. But a p-value, the probability of data this extreme given the null, is a completely different thing than the probability of the hypothesis given the data, which is why it's a bad parody of frequentist statistics.
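The contrast can be made explicit in R: the cartoon frequentist's "p-value" is just the 1/36 chance of two sixes, while Bayes' theorem (reusing the numbers worked out in question 4) gives the probability we actually care about.

```r
#the frequentist's number: P(machine says yes | sun did NOT explode)
p_value <- (1/6)^2
p_value
## [1] 0.02777778

#the Bayesian's number: P(sun exploded | machine says yes)
posterior <- ((35/36) * 0.0001) /
  (((35/36) * 0.0001) + ((1/36) * (1 - 0.0001)))
posterior
## [1] 0.003488141
```

A p-value below 0.05 only says the data would be surprising under the null; it says nothing directly about how probable the null itself is.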

  5. Quailing at the Prospect of Linear Models I’d like us to walk through the three different ‘engines’ that we have learned about to fit linear models. To motivate this, we’ll look at Burness et al.’s 2012 study "Post-hatch heat warms adult beaks: irreversible physiological plasticity in Japanese quail" http://rspb.royalsocietypublishing.org/content/280/1767/20131436.short the data for which they have made available at Data Dryad at http://datadryad.org/resource/doi:10.5061/dryad.gs661. We’ll be looking at the morphology data.
setwd("/Users/wendiwhite/Desktop/Ecology/UMass Boston Masters/UMass Masters classes/bio stats 607/data") #machine-specific path to the downloaded Dryad data
morphology <- read.csv("Morphology data.csv") %>% #load in morph data w/out na's
  na.omit()
head(morphology) #peek at the first few rows rather than printing the whole frame
##   bird_num Sex age_days exp_temp_C mass_g tarsus_mm culmen_mm depth_mm
## 1        1            5         15  16.09     19.38      7.64     4.23
## 2        2   m        5         15  19.22     20.38      7.49     4.46
## 3        3   f        5         15  17.51     19.04      7.31     3.92
## 4        4   m        5         15  14.36     20.11      7.34     3.85
## 5        5   f        5         15  17.43     21.82      8.24     4.42
## 6        6   m        5         15  15.66     19.83      6.82     3.65
## 174       14   f       13         15  42.61     24.70      8.68     4.63
## 175       15   m       13         15  30.62     23.83      8.72     4.73
## 176       16   f       13         15  62.45     28.77     10.60     6.42
## 177       17   f       13         15  45.28     25.71     10.11     6.16
## 178       18   m       13         15  32.18     21.30      9.00     4.78
## 179       19   f       13         15  51.91     27.05      9.64     4.95
## 180       20   m       13         15  44.30     22.45      9.95     4.50
## 181       21   m       13         30  50.90     26.94      9.37     5.36
## 182       22   f       13         30  64.58     30.60      9.72     5.09
## 183       23   m       13         30  42.86     27.14      9.96     4.55
## 184       24   m       13         30  42.50     26.62      8.99     5.33
## 185       25   m       13         30  40.74     26.40     10.77     5.67
## 186       26   m       13         30  39.90     25.84      8.74     4.50
## 187       27   f       13         30  38.63     22.70      8.82     5.80
## 188       28   f       13         30  48.38     27.75     10.03     5.18
## 189       29   f       13         30  43.77     27.94      9.72     4.45
## 190       30   f       13         30  37.29     24.94      8.55     4.64
## 191       31   f       13         30  45.43     25.41      8.88     5.40
## 192       32   m       13         30  43.38     27.94      8.93     5.48
## 193       33   m       13         30  42.32     27.34      7.64     6.18
## 194       34   m       13         30  37.48     26.35      9.27     4.99
## 195       35   f       13         30  47.86     24.73     10.25     5.86
## 196       36   f       13         30  42.31     26.02      9.33     5.57
## 197       37           13         30  39.06     26.41      9.51     5.37
## 198       38   m       13         30  50.30     26.02      8.06     5.70
## 199       39   f       13         30  53.87     27.73      8.84     5.20
## 202        2   m       15         15  62.14     27.73      8.92     5.25
## 203        3   f       15         15  49.66     26.25      8.09     5.69
## 204        4   m       15         15  45.86     26.17      9.67     4.84
## 205        5   f       15         15  72.02     29.74     10.03     5.47
## 206        6   m       15         15  46.00     24.95      9.69     4.50
## 207        7   m       15         15  64.85     29.49     10.60     5.28
## 209        9   m       15         15  57.05     28.18      9.64     4.88
## 210       10   f       15         15  66.50     27.42     10.04     5.25
## 211       11   f       15         15  55.07     26.32      9.33     4.95
## 212       12   m       15         15  40.75     24.40      9.55     5.31
## 213       13   f       15         15  43.29     25.30      8.88     6.30
## 214       14   f       15         15  45.68     27.24      9.33     4.40
## 215       15   m       15         15  28.89     20.73      9.29     5.47
## 216       16   f       15         15  70.60     31.46     11.42     6.45
## 217       17   f       15         15  48.26     26.42     10.83     5.32
## 218       18   m       15         15  35.24     23.01      8.94     4.07
## 219       19   f       15         15  55.36     27.71      9.90     5.83
## 220       20   m       15         15  46.74     25.21      9.72     5.05
## 221       21   m       15         30  58.29     27.70      8.96     4.55
## 222       22   f       15         30  73.00     30.36      9.11     5.50
## 223       23   m       15         30  48.15     26.14     10.27     5.75
## 224       24   m       15         30  47.35     27.06      9.21     4.68
## 225       25   m       15         30  44.86     27.17     11.35     5.28
## 226       26   m       15         30  46.19     24.61      9.24     4.75
## 227       27   f       15         30  42.86     28.19      9.56     4.79
## 228       28   f       15         30  51.89     26.81      9.98     5.73
## 229       29   f       15         30  50.15     29.01      9.78     5.13
## 230       30   f       15         30  38.93     26.54     10.67     4.96
## 231       31   f       15         30  50.09     25.00      9.59     5.45
## 232       32   m       15         30  50.15     27.89      9.50     4.64
## 233       33   m       15         30  49.66     27.07      9.86     6.46
## 234       34   m       15         30  52.99     28.24     10.83     5.99
## 235       35   f       15         30  53.52     28.60     10.78     4.93
## 236       36   f       15         30  49.99     25.26     10.11     4.60
## 238       38   m       15         30  57.22     26.19      8.92     6.17
## 239       39   f       15         30  64.16     28.89      9.26     5.30
## 242        2   m       17         15  77.59     29.65     10.91     5.36
## 243        3   f       17         15  63.94     27.77      9.68     4.98
## 244        4   m       17         15  58.52     30.42      9.80     5.60
## 245        5   f       17         15  92.61     31.95     12.78     5.51
## 246        6   m       17         15  61.68     26.66     10.92     5.01
## 247        7   m       17         15  81.63     31.36     10.91     5.73
## 249        9   m       17         15  71.42     30.03     10.75     5.74
## 250       10   f       17         15  86.16     31.61     10.86     6.50
## 251       11   f       17         15  73.48     29.58     10.45     6.35
## 252       12   m       17         15  54.77     25.23      9.23     5.51
## 253       13   f       17         15  57.90     25.61      9.64     5.24
## 254       14   f       17         15  60.89     28.37     10.42     5.20
## 255       15   m       17         15  38.69     24.32     10.14     4.82
## 256       16   f       17         15  93.84     27.71     11.08     6.27
## 257       17   f       17         15  64.38     27.28     10.95     5.00
## 258       18   m       17         15  49.82     25.84      9.50     4.59
## 259       19   f       17         15  75.38     29.00     10.92     4.25
## 260       20   m       17         15  66.48     27.81      9.05     5.52
## 261       21   m       17         30  73.41     27.06      9.98     4.74
## 262       22   f       17         30  90.02     34.04     10.50     5.70
## 263       23   m       17         30  59.88     27.05     10.88     5.89
## 264       24   m       17         30  60.68     28.40      9.27     5.80
## 265       25   m       17         30  56.38     29.15     13.60     5.88
## 266       26   m       17         30  60.20     27.72      9.99     5.18
## 267       27   f       17         30  53.68     25.56     10.27     4.95
## 268       28   f       17         30  63.67     27.80     10.29     5.26
## 269       29   f       17         30  65.80     29.75     10.50     5.24
## 270       30   f       17         30  54.35     27.70     10.66     4.11
## 271       31   f       17         30  62.24     26.19     10.04     5.63
## 272       32   m       17         30  61.87     28.81     10.31     5.67
## 273       33   m       17         30  60.03     27.24     10.31     6.09
## 274       34   m       17         30  67.11     28.49     10.36     5.10
## 275       35   f       17         30  68.79     29.07     10.75     5.91
## 276       36   f       17         30  61.42     27.07      9.31     4.90
## 278       38   m       17         30  72.33     31.30      9.31     5.89
## 279       39   f       17         30  77.48     31.87      8.15     5.19
## 282        2   m       19         15  91.97     30.99     10.85     5.27
## 283        3   f       19         15  70.12     28.14      8.78     5.28
## 284        4   m       19         15  66.65     28.00     11.07     4.62
## 285        5   f       19         15 104.14     34.00     12.30     5.69
## 286        6   m       19         15  75.47     39.44     10.97     5.61
## 287        7   m       19         15  92.56     27.88      7.42     6.19
## 289        9   m       19         15  79.01     29.86      9.42     6.18
## 290       10   f       19         15 100.44     32.56     10.68     5.66
## 291       11   f       19         15  86.62     31.32      9.88     5.92
## 292       12   m       19         15  63.05     26.08     10.20     5.45
## 293       13   f       19         15  65.95     25.72      9.05     4.81
## 294       14   f       19         15  70.02     27.41     10.39     6.12
## 295       15   m       19         15  48.13     24.62      9.63     5.11
## 296       16   f       19         15 106.90     34.26     11.28     4.97
## 297       17   f       19         15  70.89     29.20     11.09     4.80
## 298       18   m       19         15  56.90     27.37     10.47     4.86
## 299       19   f       19         15  88.75     28.59     11.28     4.61
## 300       20   m       19         15  79.28     27.11     10.27     5.26
## 301       21   m       19         30  85.11     28.78      9.75     5.03
## 302       22   f       19         30 105.22     32.98     11.68     5.10
## 303       23   m       19         30  68.03     32.38     10.28     5.61
## 304       24   m       19         30  71.40     28.00      9.52     4.76
## 305       25   m       19         30  68.92     29.60     12.97     5.19
## 306       26   m       19         30  72.84     29.55     11.92     5.29
## 307       27   f       19         30  66.10     28.50      8.93     4.18
## 308       28   f       19         30  74.60     26.73     10.47     6.54
## 309       29   f       19         30  77.50     31.81     11.14     5.13
## 310       30   f       19         30  65.83     28.33      9.70     5.31
## 311       31   f       19         30  73.45     29.43      9.61     5.50
## 312       32   m       19         30  72.95     28.16     10.59     6.19
## 313       33   m       19         30  68.16     29.99     10.82     5.59
## 314       34   m       19         30  81.28     28.91     11.92     5.52
## 315       35   f       19         30  79.24     30.70     10.40     5.11
## 316       36   f       19         30  78.81     28.98      9.37     5.41
## 318       38   m       19         30  85.25     32.60     10.38     5.50
## 319       39   f       19         30  91.93     34.10      9.87     5.81
## 322        2   m       21         15 106.38     31.74     11.57     5.35
## 323        3   f       21         15  86.80     29.80      9.88     5.35
## 324        4   m       21         15  82.83     31.43     10.60     5.08
## 325        5   f       21         15 122.70     36.06     14.00     7.67
## 326        6   m       21         15  89.48     30.91     12.07     5.73
## 327        7   m       21         15 108.90     34.53     10.44     6.44
## 329        9   m       21         15  98.14     30.18     10.55     5.62
## 330       10   f       21         15 113.80     33.45     12.81     5.44
## 331       11   f       21         15 101.16     31.97     10.01     4.69
## 332       12   m       21         15  76.15     28.17     10.91     5.10
## 333       13   f       21         15  80.31     28.02     10.06     5.18
## 334       14   f       21         15  87.14     31.58     10.45     5.48
## 335       15   m       21         15  60.50     26.00     10.52     5.69
## 336       16   f       21         15 128.91     33.63     11.76     5.81
## 337       17   f       21         15  84.35     31.70     12.64     5.08
## 338       18   m       21         15  67.33     27.24     10.17     4.40
## 339       19   f       21         15  96.79     31.07      7.08     5.26
## 340       20   m       21         15  83.36     31.95     10.36     5.44
## 341       21   m       21         30  97.06     30.97     11.09     5.00
## 342       22   f       21         30 123.78     34.38     12.98     5.82
## 343       23   m       21         30  84.25     30.46     11.74     6.11
## 344       24   m       21         30  84.27     31.63     11.00     5.04
## 345       25   m       21         30  78.55     35.39     11.70     4.89
## 346       26   m       21         30  87.60     30.22     11.99     6.91
## 347       27   f       21         30  78.86     30.25     11.67     5.52
## 348       28   f       21         30  86.26     30.03     11.15     5.28
## 349       29   f       21         30  94.11     34.98     11.55     6.01
## 350       30   f       21         30  80.51     31.52     10.04     5.07
## 351       31   f       21         30  87.88     29.59     10.40     4.90
## 352       32   m       21         30  85.67     29.18     10.90     4.98
## 353       33   m       21         30  82.74     32.28     11.22     5.51
## 354       34   m       21         30  98.22     35.07     12.38     5.70
## 355       35   f       21         30  96.80     28.98     12.39     5.38
## 356       36   f       21         30  86.28     29.27     11.08     5.35
## 358       38   m       21         30  98.33     33.98     10.89     5.85
## 359       39   f       21         30 104.78     34.12     11.39     6.14
## 362        2   m       23         15 126.81     34.58     10.41     5.00
## 363        3   f       23         15 103.26     23.52     10.88     5.40
## 364        4   m       23         15  98.18     30.78     11.52     5.11
## 365        5   f       23         15 143.30     36.36     12.72     5.80
## 366        6   m       23         15 107.94     32.68     12.17     5.11
## 367        7   m       23         15 129.69     35.19     13.47     6.11
## 369        9   m       23         15 117.33     33.00      9.85     5.41
## 370       10   f       23         15 133.46     35.19     12.33     5.36
## 371       11   f       23         15 116.80     38.67     10.13     5.47
## 372       12   m       23         15  91.88     29.62     10.24     5.42
## 373       13   f       23         15  92.84     29.19     10.56     5.79
## 374       14   f       23         15 106.32     31.56     12.13     5.37
## 375       15   m       23         15  77.28     24.70     12.19     5.93
## 376       16   f       23         15 148.20     35.45     13.44     5.71
## 377       17   f       23         15 103.11     28.30     11.18     5.45
## 378       18   m       23         15  82.78     30.10     11.08     4.73
## 379       19   f       23         15 117.40     34.85     12.52     5.34
## 380       20   m       23         15 110.82     32.30     11.15     5.40
## 381       21   m       23         30 110.73     32.81     12.69     6.34
## 382       22   f       23         30 138.97     35.28     12.51     5.97
## 383       23   m       23         30  97.38     32.30     12.29     5.31
## 384       24   m       23         30  95.95     33.80     12.33     5.62
## 385       25   m       23         30  94.82     35.78     13.40     5.58
## 386       26   m       23         30  97.09     33.39     13.37     4.64
## 387       27   f       23         30  91.00     31.54     11.45     6.05
## 388       28   f       23         30  99.82     34.88     13.30     5.52
## 389       29   f       23         30 112.04     36.71     13.13     6.52
## 390       30   f       23         30  94.24     31.58     12.36     5.49
## 391       31   f       23         30 101.80     34.18     11.61     5.99
## 392       32   m       23         30  98.26     33.35     11.22     5.04
## 393       33   m       23         30  96.10     31.72     10.15     6.05
## 394       34   m       23         30 115.15     34.36     13.92     5.26
## 395       35   f       23         30 107.20     32.98     12.29     6.40
## 396       36   f       23         30 102.00     34.37     10.86     5.45
## 398       38   m       23         30 118.98     34.61     12.64     5.75
## 399       39   f       23         30 129.44     35.62     11.80     6.77
## 402        2   m       25         15 144.22     36.00     11.00     5.37
## 403        3   f       25         15 118.47     32.67     10.63     5.18
## 404        4   m       25         15 113.28     33.32     11.88     5.60
## 405        5   f       25         15 159.53     38.21     14.30     6.61
## 406        6   m       25         15 121.62     32.76     12.43     6.00
## 407        7   m       25         15 145.70     36.49     12.90     6.79
## 409        9   m       25         15 135.05     35.60     11.62     6.19
## 410       10   f       25         15 145.65     34.16     13.38     5.96
## 411       11   f       25         15 129.14     34.42     10.57     5.60
## 412       12   m       25         15 106.94     30.95     12.63     5.46
## 413       13   f       25         15 110.50     30.92      7.44     5.11
## 414       14   f       25         15 124.00     32.20     12.46     5.29
## 415       15   m       25         15  90.56     30.05     10.73     6.49
## 416       16   f       25         15 165.65     35.43     12.41     6.63
## 417       17   f       25         15 121.32     31.92     13.55     5.99
## 418       18   m       25         15 100.28     30.11     11.08     5.64
## 419       19   f       25         15 137.36     37.34     11.14     5.18
## 420       20   m       25         15 127.72     32.67     11.79     6.14
## 421       21   m       25         30 128.20     35.08     11.77     5.99
## 422       22   f       25         30 155.49     36.00     13.69     6.19
## 423       23   m       25         30 109.15     35.52     12.92     5.86
## 424       24   m       25         30 108.06     35.04     10.49     5.39
## 425       25   m       25         30 105.24     35.44     14.47     5.79
## 426       26   m       25         30 109.60     34.55     12.99     5.92
## 427       27   f       25         30 103.80     32.38     12.91     5.30
## 428       28   f       25         30 112.15     34.84     10.91     5.76
## 429       29   f       25         30 126.95     40.78     12.86     5.98
## 430       30   f       25         30 103.73     35.69     12.99     5.74
## 431       31   f       25         30 111.72     34.46     12.42     6.73
## 432       32   m       25         30 111.72     36.73     12.64     5.14
## 433       33   m       25         30 106.52     35.00     12.41     5.82
## 434       34   m       25         30 128.00     37.41     12.20     6.23
## 435       35   f       25         30 120.28     37.28     12.47     6.63
## 436       36   f       25         30 112.06     35.30     11.75     6.23
## 438       38   m       25         30 130.28     36.30     12.53     5.88
## 439       39   f       25         30 139.94     36.11     12.13     6.74
## 442        2   m       27         15 172.97     37.10     12.51     5.50
## 443        3   f       27         15 142.03     34.70     11.93     6.19
## 444        4   m       27         15 133.51     33.34     11.04     4.91
## 445        5   f       27         15 179.04     39.89     14.17     5.79
## 446        6   m       27         15 142.86     37.05     13.24     5.44
## 447        7   m       27         15 167.26     38.85     12.53     5.73
## 449        9   m       27         15 159.45     35.71     12.77     6.70
## 450       10   f       27         15 160.00     34.23     13.11     6.18
## 451       11   f       27         15 154.20     36.61     12.99     5.00
## 452       12   m       27         15 124.89     32.45     12.42     6.08
## 453       13   f       27         15 131.87     32.11     12.75     5.75
## 454       14   f       27         15 146.77     35.40     10.47     4.99
## 455       15   m       27         15 109.24     31.95     11.65     4.90
## 456       16   f       27         15 186.97     37.62     14.29     5.04
## 457       17   f       27         15 137.40     33.38     13.79     6.11
## 458       18   m       27         15 120.04     32.13     13.19     5.86
## 459       19   f       27         15 158.94     36.61     13.91     5.79
## 460       20   m       27         15 145.10     29.92     13.10     6.09
## 461       21   m       27         30 146.78     35.02     11.74     6.23
## 462       22   f       27         30 179.29     38.59     14.51     6.46
## 463       23   m       27         30 125.84     34.02     11.76     5.45
## 464       24   m       27         30 125.14     36.05     11.33     5.84
## 465       25   m       27         30 120.60     36.81     12.38     6.47
## 466       26   m       27         30 120.77     36.43     13.06     5.92
## 467       27   f       27         30 127.43     32.59     13.42     5.83
## 468       28   f       27         30 135.12     34.55     11.92     5.38
## 469       29   f       27         30 144.25     42.01     14.48     6.11
## 470       30   f       27         30 117.26     36.45     13.01     6.52
## 471       31   f       27         30 125.00     36.90     13.54     5.54
## 472       32   m       27         30 128.07     34.44     12.75     5.68
## 473       33   m       27         30 121.56     34.89     14.25     5.65
## 474       34   m       27         30 151.44     35.69     12.99     6.51
## 475       35   f       27         30 136.60     33.58     12.44     6.81
## 476       36   f       27         30 130.00     34.71     13.20     5.02
## 478       38   m       27         30 144.33     36.33     13.15     6.14
## 479       39   f       27         30 160.00     37.26     12.80     5.43
## 482        2   m       29         15 174.78     35.39     11.71     6.33
## 483        3   f       29         15 155.30     30.80     12.38     6.81
## 484        4   m       29         15 147.80     36.20     13.27     6.30
## 485        5   f       29         15 192.08     37.34     15.62     6.80
## 486        6   m       29         15 157.75     35.17     14.27     6.49
## 487        7   m       29         15 183.44     37.70     13.88     5.46
## 489        9   m       29         15 178.69     38.61     14.39     5.89
## 490       10   f       29         15 152.09     35.59     13.23     5.32
## 491       11   f       29         15 174.62     37.69     11.44     5.31
## 492       12   m       29         15 144.80     31.49     12.75     5.58
## 493       13   f       29         15 156.94     33.83     12.31     6.61
## 494       14   f       29         15 165.83     36.80     13.27     6.03
## 495       15   m       29         15 129.98     32.70     13.57     6.09
## 496       16   f       29         15 207.09     38.90     14.03     5.60
## 497       17   f       29         15 153.70     34.70     13.23     5.51
## 498       18   m       29         15 144.99     32.16     11.49     5.61
## 499       19   f       29         15 180.92     39.77     12.52     6.47
## 500       20   m       29         15 164.20     32.66     11.01     5.73
## 501       21   m       29         30 158.78     34.98     11.32     5.83
## 502       22   f       29         30 187.30     39.29     13.64     6.30
## 503       23   m       29         30 132.00     36.74     13.89     5.99
## 504       24   m       29         30 130.78     35.79     11.10     5.91
## 505       25   m       29         30 131.88     37.13     13.61     6.00
## 506       26   m       29         30 125.98     35.78     11.99     5.02
## 507       27   f       29         30 129.80     34.71     10.86     6.27
## 508       28   f       29         30 139.30     36.16     13.19     6.86
## 509       29   f       29         30 152.00     41.91     12.90     7.40
## 510       30   f       29         30 127.87     36.32     13.51     5.19
## 511       31   f       29         30 141.53     34.64     13.26     5.63
## 512       32   m       29         30 135.86     36.62     10.57     6.39
## 513       33   m       29         30 126.89     34.86     13.19     6.48
## 514       34   m       29         30 154.28     37.54     13.28     5.91
## 515       35   f       29         30 143.92     38.46     14.22     6.23
## 516       36   f       29         30 136.08     35.52     13.33     5.85
## 518       38   m       29         30 149.45     38.88     13.06     5.74
## 519       39   f       29         30 165.59     37.08     13.41     6.26
## 522        2   m       34         15 217.89     40.10     14.03     5.80
## 523        3   f       34         15 172.94     34.09     12.35     5.82
## 524        4   m       34         15 168.72     36.41     13.17     5.34
## 525        5   f       34         15 221.42     39.86     15.48     6.51
## 526        6   m       34         15 174.98     37.36     13.81     6.51
## 527        7   m       34         15 197.00     37.07     14.42     6.24
## 529        9   m       34         15 203.75     39.06     13.64     6.56
## 530       10   f       34         15 192.59     39.10     14.01     5.61
## 531       11   f       34         15 188.72     37.20     13.37     5.31
## 532       12   m       34         15 157.54     34.81     12.31     5.41
## 533       13   f       34         15 172.74     34.68     12.96     6.02
## 534       14   f       34         15 191.32     38.87     14.05     5.24
## 535       15   m       34         15 147.63     34.09     10.95     6.40
## 536       16   f       34         15 230.42     38.40     15.14     7.09
## 537       17   f       34         15 186.52     39.24     15.45     6.01
## 538       18   m       34         15 164.21     34.03     13.73     6.36
## 539       19   f       34         15 203.25     41.33     14.31     6.12
## 540       20   m       34         15 187.92     36.14     12.73     5.58
## 541       21   m       34         30 170.06     36.33     11.98     5.04
## 542       22   f       34         30 205.64     39.61     13.49     6.41
## 543       23   m       34         30 162.19     39.64     14.12     6.14
## 544       24   m       34         30 157.52     39.11     12.63     5.53
## 545       25   m       34         30 160.30     35.48     14.19     6.03
## 546       26   m       34         30 161.33     39.36     13.90     6.06
## 547       27   f       34         30 157.61     37.33     14.58     5.72
## 548       28   f       34         30 166.56     36.13     14.59     5.82
## 549       29   f       34         30 185.92     44.64     17.38     6.24
## 550       30   f       34         30 159.53     38.92     15.15     6.24
## 551       31   f       34         30 171.42     37.94     13.59     6.53
## 552       32   m       34         30 166.08     38.98     15.35     5.74
## 553       33   m       34         30 152.98     37.29     12.20     6.10
## 554       34   m       34         30 188.97     38.51     14.14     5.84
## 555       35   f       34         30 170.00     40.79     14.69     7.17
## 556       36   f       34         30 155.49     40.18     13.39     6.58
## 558       38   m       34         30 167.48     37.97     11.11     5.93
## 559       39   f       34         30 193.52     37.90     14.15     6.03
## 562        2   m       39         15 250.72     39.23     13.49     5.89
## 563        3   f       39         15 197.17     35.67     11.24     6.41
## 564        4   m       39         15 189.35     38.19     12.99     5.44
## 565        5   f       39         15 240.25     36.81     16.35     6.35
## 566        6   m       39         15 199.35     37.74     14.22     6.18
## 567        7   m       39         15 207.70     38.18     14.59     6.14
## 569        9   m       39         15 232.44     39.74     13.56     5.74
## 570       10   f       39         15 206.61     35.70     15.70     6.94
## 571       11   f       39         15 206.20     35.84     12.33     6.16
## 572       12   m       39         15 181.05     34.74     12.38     6.12
## 573       13   f       39         15 195.79     33.38     13.77     5.91
## 574       14   f       39         15 232.40     39.98     13.59     5.65
## 575       15   m       39         15 174.33     36.63     13.15     4.95
## 576       16   f       39         15 249.57     36.90     15.43     5.92
## 577       17   f       39         15 224.88     37.65     13.92     5.86
## 578       18   m       39         15 187.59     34.59     12.33     5.75
## 579       19   f       39         15 227.14     40.43     14.04     5.71
## 580       20   m       39         15 206.87     37.17     13.54     6.04
## 581       21   m       39         30 176.74     34.52     12.99     5.49
##  ... [remaining printed rows of the morphology data frame, and the wrapped
##   width_mm / NOTES columns, omitted for brevity] ...

5a) Three fits (10 points) To begin with, I’d like you to fit the relationship that describes how Tarsus (leg) length predicts upper beak (Culmen) length. Fit this relationship using least squares, likelihood, and Bayesian techniques. For each fit, demonstrate that the necessary assumptions have been met. Note, functions used to fit with likelihood and Bayes may or may not behave well when fed NAs. So look out for those errors.
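Since the prompt warns that the likelihood and Bayesian fitting functions may choke on NAs, one option is to drop incomplete rows up front — a minimal sketch, assuming the `morphology` data frame and the `tarsus_mm`/`culmen_mm` columns used in the fits below:

```r
# Drop rows with NAs in the two variables of interest before fitting.
# drop_na() is from tidyr, which is already loaded above.
morphology <- morphology %>%
  drop_na(tarsus_mm, culmen_mm)
```

Doing this once at the top means all three fits (least squares, likelihood, Bayes) see the same complete-case data, so their estimates are directly comparable.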

#LEAST SQUARES####
ggplot(data= morphology, mapping=aes(x=tarsus_mm,
                                     y=culmen_mm))  +
  geom_point() #first graph the relationship of tarsus on culmen

lm_morph <- lm(data=morphology, culmen_mm~tarsus_mm) #fit a lm to find slope/ y-int
lm_morph  #y-int =  -0.09871     slope= 0.37293  
## 
## Call:
## lm(formula = culmen_mm ~ tarsus_mm, data = morphology)
## 
## Coefficients:
## (Intercept)    tarsus_mm  
##    -0.09871      0.37293
summary(lm_morph)
## 
## Call:
## lm(formula = culmen_mm ~ tarsus_mm, data = morphology)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -4.4081 -0.7029 -0.0328  0.7263  3.5970 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) -0.098707   0.215450  -0.458    0.647    
## tarsus_mm    0.372927   0.006646  56.116   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.238 on 764 degrees of freedom
## Multiple R-squared:  0.8048, Adjusted R-squared:  0.8045 
## F-statistic:  3149 on 1 and 764 DF,  p-value: < 2.2e-16
#plot the data from above but add in line with the slope found from our linear model found above
#plot the data from above and add the fitted line from our linear model
#note: base R plots are not chained with +, so abline() is a separate call
plot(culmen_mm ~ tarsus_mm, data = morphology)
abline(lm_morph)
#run a correlation test to see how strongly tarsus and culmen length are associated
cor.test(morphology$culmen_mm, morphology$tarsus_mm) 
## 
##  Pearson's product-moment correlation
## 
## data:  morphology$culmen_mm and morphology$tarsus_mm
## t = 56.116, df = 764, p-value < 2.2e-16
## alternative hypothesis: true correlation is not equal to 0
## 95 percent confidence interval:
##  0.8823111 0.9100845
## sample estimates:
##       cor 
## 0.8970803
#r = 0.897, a strong positive correlation; reject the null of no correlation

plot(lm_morph, which=2) #qq plot: some deviation from the line at both tails, but residuals look roughly normal

plot(lm_morph, which=5) #residuals vs leverage (Cook's distance): variance appears to grow at the extremes, likely because few birds have very large tarsus lengths
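The same assumptions can also be checked with a couple of extra diagnostics — a quick sketch, again using the `lm_morph` fit from above:

```r
# Residuals vs fitted: look for no pattern (linearity) and constant spread
plot(lm_morph, which = 1)

# Shapiro-Wilk test on the residuals as a rough normality check.
# With n ~ 766 this test is very sensitive, so read it alongside the qq plot.
shapiro.test(residuals(lm_morph))
```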

#LIKELIHOOD####
#initial visualization to determine if lm is appropriate
ggplot(data= morphology, mapping=aes(x=tarsus_mm,
                                     y=culmen_mm))  +
  geom_point() #is there relation between x and y var... seems like it! so we'll fit that model

morph_mod <- glm(culmen_mm ~ tarsus_mm, #create a glm with (y~x)
               family = gaussian(link = "identity"),  #identity b/c not transforming our data
               data=morphology)
morph_mod
## 
## Call:  glm(formula = culmen_mm ~ tarsus_mm, family = gaussian(link = "identity"), 
##     data = morphology)
## 
## Coefficients:
## (Intercept)    tarsus_mm  
##    -0.09871      0.37293  
## 
## Degrees of Freedom: 765 Total (i.e. Null);  764 Residual
## Null Deviance:       6001 
## Residual Deviance: 1172  AIC: 2505
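Because this is a likelihood fit, we can also profile the likelihood for the coefficients — `confint()` on a `glm` profiles the deviance by default (via MASS, already loaded). A sketch, using the `morph_mod` and `lm_morph` objects created above:

```r
# Profile-likelihood 95% confidence intervals for the glm coefficients
confint(morph_mod)

# Sanity check: a gaussian glm with an identity link maximizes the same
# likelihood as ordinary least squares, so the coefficients should match
coef(morph_mod) - coef(lm_morph)  # differences should be essentially zero
```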
#assumptions
morph_fit <- predict(morph_mod) #predictions from the model create above
head(morph_fit) #show just the first few predictions (all 766 is unwieldy)
##        1        2        3        4        5        6 
## 7.128614 7.501540 7.001819 7.400850 8.038555 7.296431
morph_res <- residuals(morph_mod) #extracts residuals of model
head(morph_res) #show just the first few residuals
##            1            2            3            4            5            6 
##  0.511386306 -0.011540465  0.308181408 -0.060850236  0.201444986 -0.476430741
##          566          567          569          570          571          572 
##  0.244450795  0.450363016 -1.161402746  2.485221408 -0.936988340 -0.476768892 
##          573          574          575          576          577          578 
##  1.420411516 -1.220905171 -0.411600489  1.767709283 -0.021985795 -0.470829877 
##          579          580          581          582          583          584 
## -0.938722218 -0.222980945  0.215274997  1.741304577  0.580416606  1.036628659 
##          585          586          587          588          589          590 
##  2.400443400 -1.689856432  1.179202008  1.177350771  0.015151203 -1.964024595 
##          591          592          593          594          595          596 
##  0.466992260  0.598650843  0.340336221 -2.060493070  1.502433700  0.703867746 
##          598          599          602          603          604          605 
##  0.573680183 -0.173612494 -0.829192998 -0.347983490 -0.041654079  0.566911876 
##          606          607          609          610          611          612 
##  0.236606954  0.140999655 -1.904163659  1.457211708  0.576714133  0.189667698 
##          613          614          615          616          617          618 
## -0.653344547 -2.718807692  0.955665393  1.670940975  1.970277542  2.383284697 
##          619          620          621          622          623          624 
##  0.557521720  1.054387026 -1.576790597 -0.199744163  0.072299727 -1.630744403 
##          625          626          627          628          629          630 
##  1.303482440 -0.881322362  1.038233653  0.266082584 -0.855260898  1.469672788 
##          631          632          633          634          635          636 
##  1.664445705 -0.501156504  0.040636053 -1.982922266  1.264782512 -0.391739553 
##          638          639          642          643          644          645 
## -0.003141714 -0.365410141 -0.698310117  1.743423761 -0.115356552  2.047516630 
##          646          647          649          650          651          652 
##  1.454889691 -0.582290717 -2.853446636  0.807211708 -2.449749253  1.646350531 
##          653          654          655          656          657          658 
##  1.523728683 -1.661295567 -1.265078424  0.189094829  0.994424001  2.345740688 
##          659          660          661          662          663          664 
## -1.293168509 -0.421990885  0.540138478  1.452738622  0.713289787  2.877153028 
##          665          666          667          668          669          670 
## -0.188390501  1.556714133  1.479116534  1.129999415 -0.699380561  3.078950675 
##          671          672          673          674          675          676 
##  2.568153268  0.767623809  0.489919031  1.446799607  2.648592164  1.225419151 
##          678          679          682          683          684          685 
##  1.975387266  2.447184913  0.684643448 -0.692237128 -0.565634679  1.805526330 
##          686          687          689          690          691          692 
## -0.038261618  2.301487050 -1.855190693  0.636596774 -0.370300417  0.795997110 
##          693          694          695          696          697          698 
##  1.055301792 -2.092205243  1.514665153  1.492075189 -0.680739313 -1.401851822 
##          699          700          701          702          703          704 
##  0.904755717 -0.264998040  1.527816462  0.408345921 -0.804500465 -0.805688268 
##          705          706          707          708          709          710 
##  1.002075189  0.090609259  1.348624049 -0.152954151  0.453021840  2.034862896 
##          711          712          713          714          715          716 
##  2.919528635  0.708426306  1.035638598  3.058731228 -0.723778352  0.888180063 
##          718          719          722          723          724          725 
## -0.032980945 -0.628502770 -0.773253983  1.149972620 -1.273558905  1.559892236 
##          726          727          729          730          731          732 
##  0.801652908  1.717570219  0.697270387  0.805114229 -1.656266228  0.778752932 
##          733          734          735          736          737          738 
##  2.834001720 -1.018058785  0.063204313  1.737265297  2.068843496  0.163562824 
##          739          740          741          742          743          744 
## -0.693831942 -2.172773023  0.042128779  1.887960615 -0.385105219  2.568533485 
##          745          746          747          748          749          750 
##  2.127489835 -2.103253983  1.214504385  1.608752932 -0.371397656  1.250865681 
##          751          752          753          754          755          756 
##  0.619892236  0.599062944 -2.051375951  2.422626354  3.138789907  0.794477590 
##          758          759          762          764          765          766 
## -1.046207548  1.971577614 -0.540739313  0.606965465  2.971379871  2.428565369 
##          767          769          770          771          772          773 
##  1.628479895  1.963733772  1.039367866  0.946767722  1.693557734  1.678950675 
##          774          775          777          778          779          780 
## -0.471707668  2.033755477  2.542380111  0.171240807  0.915419151  2.196548274 
##          781          782          783          784          785          786 
## -0.681900321  1.779341071  2.229341071  2.088206858  3.365472740 -0.635656384 
##          787          788          789          790          791          792 
## -2.209963611  1.877045849 -3.181333887  2.233567914  1.150641143 -0.411766347 
##          793          794          795          796          798          799 
##  0.514589859 -0.470295327  0.824455885  0.796382416  1.465585009  0.052299727 
##          842          843          844          845          846          847 
##  1.388928971  1.608565369 -0.741322362  0.518934060  2.842819007  2.306104289 
##          849          850          851          852          853          854 
##  1.618902176  1.632241047 -0.551654079  0.700914181  3.376601864 -0.272066180 
##          855          857          859          860          861          862 
##  2.606302032  0.345697278  0.226087674  2.095553124  2.422155573  1.429426546 
##          863          864          865          866          867          868 
##  0.568789907  1.731384961  3.024145873  1.072711828  1.763375261  0.238458190 
##          869          870          871          872          873          874 
##  1.337665873  1.062519175  1.993594709  1.955997110 -0.777314967  3.597019055 
##          875          876          878          879 
##  0.477077734  2.616606954  2.002021600  0.165226498
qplot(morph_fit, morph_res) #plot fitted vs. residuals of the model; no relationship, which is what we want

qqnorm(morph_res)
qqline(morph_res)

plot(profile(morph_mod)) #profiles should be straight lines (they "split the V"), which means the likelihood is well behaved

#likelihood ratio test of the model
morph_mod_null <- glm(culmen_mm ~ 1, #null model: what the response looks like with no predictor (~1)
               family = gaussian(link = "identity"), 
               data = morphology)
  
anova(morph_mod_null, morph_mod, test = "LRT") #likelihood ratio test: tiny p-value, so we reject the null
## Analysis of Deviance Table
## 
## Model 1: culmen_mm ~ 1
## Model 2: culmen_mm ~ tarsus_mm
##   Resid. Df Resid. Dev Df Deviance  Pr(>Chi)    
## 1       765     6000.9                          
## 2       764     1171.7  1   4829.3 < 2.2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
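The p-value above can also be reproduced by hand from the deviance table: scale the drop in deviance by the dispersion estimate and compare it to a chi-squared distribution with 1 df.

```r
# LRT by hand, using the numbers from the table above
dev_drop <- 6000.9 - 1171.7        # deviance explained by tarsus_mm
test_stat <- dev_drop / 1.533593   # scale by the gaussian dispersion estimate
pchisq(test_stat, df = 1, lower.tail = FALSE) # effectively 0, matching Pr(>Chi)
```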
#t-tests of parameters
summary(morph_mod)
## 
## Call:
## glm(formula = culmen_mm ~ tarsus_mm, family = gaussian(link = "identity"), 
##     data = morphology)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -4.4081  -0.7029  -0.0328   0.7263   3.5970  
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) -0.098707   0.215450  -0.458    0.647    
## tarsus_mm    0.372927   0.006646  56.116   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for gaussian family taken to be 1.533593)
## 
##     Null deviance: 6000.9  on 765  degrees of freedom
## Residual deviance: 1171.7  on 764  degrees of freedom
## AIC: 2505.4
## 
## Number of Fisher Scoring iterations: 2
#BAYESIAN####
morph_lm_bayes <- brm(culmen_mm ~ tarsus_mm, #create bayes model (y~x)
                     family = gaussian(link="identity"),
                     data= morphology)
## Compiling Stan program...
## (output of rstan's C toolchain check omitted: the test compile of foo.c failed on Eigen headers, but the Stan model itself compiled and sampling proceeded)
## Start sampling
## 
## SAMPLING FOR MODEL 'e754aa52dd0feb2be6d67820a7a0fe08' NOW (CHAIN 1).
## Chain 1: 
## Chain 1: Gradient evaluation took 2.2e-05 seconds
## Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.22 seconds.
## Chain 1: Adjust your expectations accordingly!
## Chain 1: 
## Chain 1: 
## Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
## Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
## Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
## Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
## Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
## Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
## Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
## Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
## Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
## Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
## Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
## Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
## Chain 1: 
## Chain 1:  Elapsed Time: 0.024852 seconds (Warm-up)
## Chain 1:                0.022819 seconds (Sampling)
## Chain 1:                0.047671 seconds (Total)
## Chain 1: 
## (chains 2-4 produced analogous progress output; each finished in roughly 0.05 seconds)
morph_lm_bayes
##  Family: gaussian 
##   Links: mu = identity; sigma = identity 
## Formula: culmen_mm ~ tarsus_mm 
##    Data: morphology (Number of observations: 766) 
## Samples: 4 chains, each with iter = 2000; warmup = 1000; thin = 1;
##          total post-warmup samples = 4000
## 
## Population-Level Effects: 
##           Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
## Intercept    -0.10      0.22    -0.53     0.34 1.00     4803     3056
## tarsus_mm     0.37      0.01     0.36     0.39 1.00     4841     3004
## 
## Family Specific Parameters: 
##       Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
## sigma     1.24      0.03     1.18     1.30 1.00     3599     2891
## 
## Samples were drawn using sampling(NUTS). For each parameter, Bulk_ESS
## and Tail_ESS are effective sample size measures, and Rhat is the potential
## scale reduction factor on split chains (at convergence, Rhat = 1).
print(summary(morph_lm_bayes), digits= 5) #summary table printed to 5 digits
##  Family: gaussian 
##   Links: mu = identity; sigma = identity 
## Formula: culmen_mm ~ tarsus_mm 
##    Data: morphology (Number of observations: 766) 
## Samples: 4 chains, each with iter = 2000; warmup = 1000; thin = 1;
##          total post-warmup samples = 4000
## 
## Population-Level Effects: 
##           Estimate Est.Error l-95% CI u-95% CI    Rhat Bulk_ESS Tail_ESS
## Intercept -0.09660   0.21934 -0.53210  0.33794 1.00033     4803     3056
## tarsus_mm  0.37285   0.00679  0.35953  0.38632 1.00024     4841     3004
## 
## Family Specific Parameters: 
##       Estimate Est.Error l-95% CI u-95% CI    Rhat Bulk_ESS Tail_ESS
## sigma  1.24106   0.03176  1.18081  1.30492 1.00232     3599     2891
## 
## Samples were drawn using sampling(NUTS). For each parameter, Bulk_ESS
## and Tail_ESS are effective sample size measures, and Rhat is the potential
## scale reduction factor on split chains (at convergence, Rhat = 1).
color_scheme_set("viridis") #set a color scheme 

#visually investigate our chains
#chains converge and the posterior distributions look good!
plot(morph_lm_bayes) 

plot(morph_lm_bayes, pars = "b_Intercept")

mcmc_trace(morph_lm_bayes)

#Look at a diagnostic of convergence
#Gelman-Rubin statistic: Rhat (want Rhat close to 1)
rhat(morph_lm_bayes) #can see all values are close to 1 #looks good!
## b_Intercept b_tarsus_mm       sigma        lp__ 
##   1.0003220   1.0002465   0.9997096   1.0014244
  #values are usually fine if they differ from 1 only in the hundredths
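The same check can be done programmatically. A minimal sketch, assuming the fitted `morph_lm_bayes` from above:

```r
# flag convergence problems: TRUE if every Rhat is within ~0.01 of 1
all(abs(rhat(morph_lm_bayes) - 1) < 0.01)
```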

mcmc_acf(morph_lm_bayes) #looks good

#check the match between our data and our chains
# for distributions of data of y
pp_check(morph_lm_bayes, "dens_overlay") #light lines are draws from the posterior predictive distribution
## Using 10 posterior samples for ppc type 'dens_overlay' by default.

                #dark line is the observed distribution of culmen length
                #looks like a good fit!

#is our error normal? check the residuals,
#which matter for error around future predictions
pp_check(morph_lm_bayes, "error_hist") #residuals look good!
## Using 10 posterior samples for ppc type 'error_hist' by default.
## `stat_bin()` using `bins = 30`. Pick better value with `binwidth`.

pp_check(morph_lm_bayes, "error_scatter") #relationship 
## Using 10 posterior samples for ppc type 'error_scatter' by default.

        #between the errors and the observed values; it looks
        #generally linear, so we're good!
pp_check(morph_lm_bayes, "error_scatter_avg") #takes avg pred error
## Using all posterior samples for ppc type 'error_scatter_avg' by default.

          #over all posteriors


#fitted vs. residuals: did we miss a nonlinearity?
morph_res <- residuals(morph_lm_bayes) %>%
  as_tibble()
morph_fit <- fitted(morph_lm_bayes) %>%
  as_tibble()

plot(y=morph_res$Estimate, x=morph_fit$Estimate) #fitted vs res plot. All looks good!
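The same diagnostic can be drawn with ggplot2, adding a zero reference line. A sketch using the two tibbles above:

```r
# fitted vs. residual plot with a dashed zero line
tibble(fitted = morph_fit$Estimate,
       residual = morph_res$Estimate) %>%
  ggplot(aes(x = fitted, y = residual)) +
  geom_point(alpha = 0.5) +
  geom_hline(yintercept = 0, linetype = "dashed")
```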

5b) Three interpretations (10 points) OK, now that we have fits, take a look! Do the coefficients and their associated measures of error in their estimation match? How would we interpret the results from these different analyses differently? Or would we? Note, confint works on lm objects as well.

We see that our lm and glm models produce exactly the same coefficients and associated measures of error, and the Bayesian model produces nearly the same. We wouldn't interpret the lm/glm results differently. The Bayesian fit is slightly different because it asks a different question and works differently under the hood, but because we didn't supply informative priors we get nearly the same output. I would argue that, since we aren't using any priors, an lm or glm will work to answer the questions being asked: Is there a relationship between tarsus and culmen? Can we use tarsus length to predict culmen length?

One other aspect to look at is confidence intervals vs. credible intervals. The glm and lm have very similar confidence intervals, which tell us that, if we replicated this experiment many times, 95% of intervals constructed this way would contain the true parameter value. In the Bayesian analysis we instead get credible intervals, which give the range containing 95% of the posterior probability for the parameter.

LM Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept) -0.098707   0.215450  -0.458    0.647
tarsus_mm    0.372927   0.006646  56.116   <2e-16 ***

                 2.5 %    97.5 %
(Intercept) -0.5216505 0.3242363
tarsus_mm    0.3598809 0.3859727

GLM Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept) -0.098707   0.215450  -0.458    0.647
tarsus_mm    0.372927   0.006646  56.116   <2e-16 ***

                 2.5 %    97.5 %
(Intercept) -0.5209805 0.3235663
tarsus_mm    0.3599015 0.3859520

BRM:
          Estimate Est.Error l-95% CI u-95% CI    Rhat Bulk_ESS Tail_ESS
Intercept -0.09415   0.21423 -0.51603  0.32000 1.00038     4491     3111
tarsus_mm  0.37278   0.00661  0.35975  0.38575 1.00037     4511     3105

#summary of models
summary(lm_morph)
## 
## Call:
## lm(formula = culmen_mm ~ tarsus_mm, data = morphology)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -4.4081 -0.7029 -0.0328  0.7263  3.5970 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) -0.098707   0.215450  -0.458    0.647    
## tarsus_mm    0.372927   0.006646  56.116   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.238 on 764 degrees of freedom
## Multiple R-squared:  0.8048, Adjusted R-squared:  0.8045 
## F-statistic:  3149 on 1 and 764 DF,  p-value: < 2.2e-16
summary(morph_mod)
## 
## Call:
## glm(formula = culmen_mm ~ tarsus_mm, family = gaussian(link = "identity"), 
##     data = morphology)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -4.4081  -0.7029  -0.0328   0.7263   3.5970  
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) -0.098707   0.215450  -0.458    0.647    
## tarsus_mm    0.372927   0.006646  56.116   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for gaussian family taken to be 1.533593)
## 
##     Null deviance: 6000.9  on 765  degrees of freedom
## Residual deviance: 1171.7  on 764  degrees of freedom
## AIC: 2505.4
## 
## Number of Fisher Scoring iterations: 2
print(summary(morph_lm_bayes), digits= 5) #summary table printed to 5 digits
##  Family: gaussian 
##   Links: mu = identity; sigma = identity 
## Formula: culmen_mm ~ tarsus_mm 
##    Data: morphology (Number of observations: 766) 
## Samples: 4 chains, each with iter = 2000; warmup = 1000; thin = 1;
##          total post-warmup samples = 4000
## 
## Population-Level Effects: 
##           Estimate Est.Error l-95% CI u-95% CI    Rhat Bulk_ESS Tail_ESS
## Intercept -0.09660   0.21934 -0.53210  0.33794 1.00033     4803     3056
## tarsus_mm  0.37285   0.00679  0.35953  0.38632 1.00024     4841     3004
## 
## Family Specific Parameters: 
##       Estimate Est.Error l-95% CI u-95% CI    Rhat Bulk_ESS Tail_ESS
## sigma  1.24106   0.03176  1.18081  1.30492 1.00232     3599     2891
## 
## Samples were drawn using sampling(NUTS). For each parameter, Bulk_ESS
## and Tail_ESS are effective sample size measures, and Rhat is the potential
## scale reduction factor on split chains (at convergence, Rhat = 1).
#conf intervals for each model
confint(morph_mod) 
## Waiting for profiling to be done...
##                  2.5 %    97.5 %
## (Intercept) -0.5209805 0.3235663
## tarsus_mm    0.3599015 0.3859520
confint(lm_morph)
##                  2.5 %    97.5 %
## (Intercept) -0.5216505 0.3242363
## tarsus_mm    0.3598809 0.3859727

5c) Everyday I’m Profilin’ (10 points) For your likelihood fit, are your profiles well behaved? For just the slope, use grid sampling to create a profile. You’ll need to write functions for this, sampling the whole grid of slope and intercept, and then take out the relevant slices as we have done before. Use the results from the fit above to provide the reasonable bounds of what you should be profiling over (3SE should do). Is it well behaved? Plot the profile and give the 80% and 95% CI (remember how we use the chisq here!). Verify your results with profileModel.

Our profiles seem to be well behaved. Our plot and the profileModel plot both give us nice parabolic curves.

95% CI for the slope: 0.3706061-0.3722222; 80% CI for the slope: 0.3710101-0.3718182.

#write a function sampling the whole grid of slope and intercept, and then take out the relevant slices as we have done before
likhood_fun <- function(slope, intercept){ #log-likelihood for a given slope/intercept
  morph_new <- slope * morphology$tarsus_mm + intercept #predicted culmen lengths under this slope/intercept
  sum(dnorm(morphology$culmen_mm, mean = morph_new, sd = sigma(lm_morph), log = TRUE)) #sum of the log-likelihoods (residual SD from the lm fit; dnorm's default sd = 1 would distort the profile)
  }

morph_grid <- crossing(slope = seq(0.35, 0.39, length.out = 100), #full grid over roughly 3SE of the slope and intercept estimates from 5b
                       intercept = seq(-0.74, 0.55, length.out = 100)) %>%
  group_by(slope, intercept) %>% #one row per slope/intercept pair
  mutate(loglikelihood = likhood_fun(slope, intercept), #log-likelihood of each pair; deviance measures fit
    deviance = -2 * loglikelihood) %>%  
  ungroup()


#slice out the profile for the slope: the best log-likelihood at each slope value
morph_profile <- morph_grid %>%
  group_by(slope) %>%
  summarise(loglikelihood = max(loglikelihood)) %>%
  ungroup()

like_plot <- ggplot(data = morph_profile, #plot slope vs. its profile log-likelihood
       mapping = aes(x = slope,
                     y = loglikelihood)) +
  geom_point() +
  geom_line(data = morph_profile %>% #slopes within the 95% CI: log-likelihood within qchisq(0.95, 1)/2 of the maximum; these ranges give the slopes reported above
              filter(loglikelihood >= (max(loglikelihood) - qchisq(0.95, df = 1)/2)),
            aes(x = slope,
                y = loglikelihood),
            color = "pink",
            size = 4) +
  geom_line(data = morph_profile %>% #slopes within the 80% CI
              filter(loglikelihood >= (max(loglikelihood) - qchisq(0.80, df = 1)/2)),
            aes(x = slope,
                y = loglikelihood),
            color = "blue",
            size = 4)
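The CI endpoints can also be pulled out numerically rather than read off the plot. A minimal sketch, assuming `morph_grid` from above:

```r
# slopes whose log-likelihood is within the 95% chi-squared cutoff of the maximum
morph_grid %>%
  filter(loglikelihood >= max(loglikelihood) - qchisq(0.95, df = 1)/2) %>%
  summarise(lower_95 = min(slope), upper_95 = max(slope))
```

The same filter with `qchisq(0.80, df = 1)` gives the 80% interval.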


#check with profileModel: the deviance profiles show nice parabolic curves, so I'd say our model is well behaved
morph_prof_model <- profileModel(morph_mod, 
                     objective = "ordinaryDeviance",
                     quantile = qchisq(0.95, df = 1)) #cutoff for a 95% profile CI
## Preliminary iteration .. Done
## 
## Profiling for parameter (Intercept) ... Done
## Profiling for parameter tarsus_mm ... Done
plot(morph_prof_model)

5d) The Power of the Prior (10 points) This data set is pretty big. After excluding NAs in the variables we’re interested in, it’s over 766 lines of data! Now, a lot of data can overwhelm a strong prior. But only to a point. Show first that there is enough data here that a prior for the slope with an estimate of 0.7 and a sd of 0.01 is overwhelmed and produces similar results to the default prior. How different are the results from the original? Second, randomly sample 10, 100, 300, and 500 data points. At which level is our prior overwhelmed (e.g., the prior slope becomes highly unlikely)? Communicate that visually in the best way you feel gets the point across, and explain your reasoning.

Old vs. new, we see quite a difference in our intercept and also a difference in our slope. It makes sense that we see a shift toward a slope of 0.7, since we put that in as a prior, though the intercept changes a lot as well! When we change the sample size, we see that a larger sample size (over 10) overwhelms our prior. We would expect this to happen, and I'd argue n = 10 doesn't overwhelm it because there is a good amount of overlap between that posterior and our original. Once we look at n = 100, though, only a small part of the tail overlaps with the original. Again, this makes sense, because 10 samples is very different from 100!

BRM - old:
          Estimate Est.Error l-95% CI u-95% CI    Rhat Bulk_ESS
Intercept -0.09415   0.21423 -0.51603  0.32000 1.00038     4491
tarsus_mm  0.37278   0.00661  0.35975  0.38575 1.00037     4511

BRM - new:
          Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
Intercept    -4.25      0.27    -4.78    -3.72 1.00     1980     2072
tarsus_mm     0.50      0.01     0.49     0.52 1.00     1925     2069
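To compare posteriors across the requested sample sizes, the same strong-prior model can be refit at each n. A sketch (slow to run; assumes `morphology` is loaded):

```r
# refit the strong-prior model on random subsamples of several sizes
sample_sizes <- c(10, 100, 300, 500)
prior_fits <- purrr::map(sample_sizes, ~ brm(
  culmen_mm ~ tarsus_mm,
  family = gaussian(link = "identity"),
  data = dplyr::sample_n(morphology, .x),
  prior = prior(normal(0.7, 0.01), class = "b", coef = "tarsus_mm"),
  chains = 3))
names(prior_fits) <- sample_sizes
```

Posterior draws of `b_tarsus_mm` from each fit can then be overlaid (e.g., with `mcmc_dens` or `ggplot2`) to see at which n the prior slope of 0.7 stops being credible.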

#set up a strong prior on the slope
morph_lm_bayes <- brm(culmen_mm ~ tarsus_mm, #create bayes model (y~x)
                     family = gaussian(link = "identity"),
                     data = morphology %>%
                       sample_n(10), #random sample of 10 data points
                      #prior for the slope, with a mean of 0.7 and sd of 0.01
                      prior = prior(normal(0.7, 0.01), class = "b", coef = "tarsus_mm"),
                      chains = 3)
## Compiling Stan program...
## Start sampling
## 
## SAMPLING FOR MODEL '82ea377065a124f0e1eacafbba221e5e' NOW (CHAIN 1).
## Chain 1: 
## Chain 1: Gradient evaluation took 1.9e-05 seconds
## Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.19 seconds.
## Chain 1: Adjust your expectations accordingly!
## Chain 1: 
## Chain 1: 
## Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
## Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
## Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
## Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
## Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
## Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
## Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
## Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
## Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
## Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
## Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
## Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
## Chain 1: 
## Chain 1:  Elapsed Time: 0.031872 seconds (Warm-up)
## Chain 1:                0.015394 seconds (Sampling)
## Chain 1:                0.047266 seconds (Total)
## Chain 1: 
## 
## SAMPLING FOR MODEL '82ea377065a124f0e1eacafbba221e5e' NOW (CHAIN 2).
## Chain 2: 
## Chain 2: Gradient evaluation took 1e-05 seconds
## Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.1 seconds.
## Chain 2: Adjust your expectations accordingly!
## Chain 2: 
## Chain 2: 
## Chain 2: Iteration:    1 / 2000 [  0%]  (Warmup)
## Chain 2: Iteration:  200 / 2000 [ 10%]  (Warmup)
## Chain 2: Iteration:  400 / 2000 [ 20%]  (Warmup)
## Chain 2: Iteration:  600 / 2000 [ 30%]  (Warmup)
## Chain 2: Iteration:  800 / 2000 [ 40%]  (Warmup)
## Chain 2: Iteration: 1000 / 2000 [ 50%]  (Warmup)
## Chain 2: Iteration: 1001 / 2000 [ 50%]  (Sampling)
## Chain 2: Iteration: 1200 / 2000 [ 60%]  (Sampling)
## Chain 2: Iteration: 1400 / 2000 [ 70%]  (Sampling)
## Chain 2: Iteration: 1600 / 2000 [ 80%]  (Sampling)
## Chain 2: Iteration: 1800 / 2000 [ 90%]  (Sampling)
## Chain 2: Iteration: 2000 / 2000 [100%]  (Sampling)
## Chain 2: 
## Chain 2:  Elapsed Time: 0.031744 seconds (Warm-up)
## Chain 2:                0.014011 seconds (Sampling)
## Chain 2:                0.045755 seconds (Total)
## Chain 2: 
## 
## SAMPLING FOR MODEL '82ea377065a124f0e1eacafbba221e5e' NOW (CHAIN 3).
## Chain 3: 
## Chain 3: Gradient evaluation took 9e-06 seconds
## Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.09 seconds.
## Chain 3: Adjust your expectations accordingly!
## Chain 3: 
## Chain 3: 
## Chain 3: Iteration:    1 / 2000 [  0%]  (Warmup)
## Chain 3: Iteration:  200 / 2000 [ 10%]  (Warmup)
## Chain 3: Iteration:  400 / 2000 [ 20%]  (Warmup)
## Chain 3: Iteration:  600 / 2000 [ 30%]  (Warmup)
## Chain 3: Iteration:  800 / 2000 [ 40%]  (Warmup)
## Chain 3: Iteration: 1000 / 2000 [ 50%]  (Warmup)
## Chain 3: Iteration: 1001 / 2000 [ 50%]  (Sampling)
## Chain 3: Iteration: 1200 / 2000 [ 60%]  (Sampling)
## Chain 3: Iteration: 1400 / 2000 [ 70%]  (Sampling)
## Chain 3: Iteration: 1600 / 2000 [ 80%]  (Sampling)
## Chain 3: Iteration: 1800 / 2000 [ 90%]  (Sampling)
## Chain 3: Iteration: 2000 / 2000 [100%]  (Sampling)
## Chain 3: 
## Chain 3:  Elapsed Time: 0.039676 seconds (Warm-up)
## Chain 3:                0.016508 seconds (Sampling)
## Chain 3:                0.056184 seconds (Total)
## Chain 3:
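Once a model has fit, it is worth confirming that brms registered the custom prior rather than silently falling back to its defaults. A small optional check (not part of the original assignment):

```r
#optional check: confirm that brms attached the strong
#normal(0.7, 0.01) prior to the tarsus_mm slope
library(brms)

#prior_summary() lists the priors actually used by a fitted model
prior_summary(morph_lm_bayes)

#get_prior() shows the defaults brms would use for this formula and
#data, useful for spotting parameters still on flat/default priors
get_prior(culmen_mm ~ tarsus_mm,
          data = morphology,
          family = gaussian(link = "identity"))
```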
#repeat with a random sample of 10 rows from the morphology data
morph_lm_bayes_10 <- brm(culmen_mm ~ tarsus_mm, #create bayes model (y~x)
                         family = gaussian(link = "identity"),
                         data = morphology %>%
                           sample_n(10),
                         #strong prior on the slope: mean 0.7, sd 0.01
                         prior = prior(normal(0.7, 0.01),
                                       class = "b", coef = "tarsus_mm"),
                         chains = 3)
#now a random sample of 100 rows
morph_lm_bayes_100 <- brm(culmen_mm ~ tarsus_mm, #create bayes model (y~x)
                          family = gaussian(link = "identity"),
                          data = morphology %>%
                            sample_n(100),
                          #strong prior on the slope: mean 0.7, sd 0.01
                          prior = prior(normal(0.7, 0.01),
                                        class = "b", coef = "tarsus_mm"),
                          chains = 3)
#now a random sample of 300 rows
morph_lm_bayes_300 <- brm(culmen_mm ~ tarsus_mm, #create bayes model (y~x)
                          family = gaussian(link = "identity"),
                          data = morphology %>%
                            sample_n(300),
                          #strong prior on the slope: mean 0.7, sd 0.01
                          prior = prior(normal(0.7, 0.01),
                                        class = "b", coef = "tarsus_mm"),
                          chains = 3)
#now a random sample of 500 rows
morph_lm_bayes_500 <- brm(culmen_mm ~ tarsus_mm, #create bayes model (y~x)
                          family = gaussian(link = "identity"),
                          data = morphology %>%
                            sample_n(500),
                          #strong prior on the slope: mean 0.7, sd 0.01
                          prior = prior(normal(0.7, 0.01),
                                        class = "b", coef = "tarsus_mm"),
                          chains = 3)
#set them all as df
morph_lm_bayes <- as.data.frame(morph_lm_bayes)

morph_lm_bayes_10 <- as.data.frame(morph_lm_bayes_10)

morph_lm_bayes_100 <- as.data.frame(morph_lm_bayes_100)

morph_lm_bayes_300 <- as.data.frame(morph_lm_bayes_300)

morph_lm_bayes_500 <- as.data.frame(morph_lm_bayes_500)


ggplot() + # overlay the densities from each subsample; the black fill is the fit to the full data set
  geom_density(data = morph_lm_bayes,
               mapping = aes(x = b_tarsus_mm),
               alpha = 0.2, fill = "black") +
  geom_density(data = morph_lm_bayes_10,
               mapping = aes(x = b_tarsus_mm),
               alpha = 0.2, color = "purple") +
  geom_density(data = morph_lm_bayes_100,
               mapping = aes(x = b_tarsus_mm),
               alpha = 0.2, color = "orange") +
  geom_density(data = morph_lm_bayes_300,
               mapping = aes(x = b_tarsus_mm),
               alpha = 0.2, color = "red") +
  geom_density(data = morph_lm_bayes_500,
               mapping = aes(x = b_tarsus_mm),
               alpha = 0.2, color = "yellow")
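
An equivalent, tidier way to build this overlay (a sketch, assuming the `morph_lm_bayes*` data frames created above exist): bind the draws together with a grouping label and let ggplot derive colors and a legend, rather than layering five separate `geom_density()` calls.

```r
posterior_draws <- bind_rows(
  full = morph_lm_bayes,
  n10  = morph_lm_bayes_10,
  n100 = morph_lm_bayes_100,
  n300 = morph_lm_bayes_300,
  n500 = morph_lm_bayes_500,
  .id  = "sample_size" # .id stores each data frame's name in a new column
)

ggplot(posterior_draws, aes(x = b_tarsus_mm, fill = sample_size)) +
  geom_density(alpha = 0.2)
```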

  1. Cross-Validation and Priors (15 points) There is some interesting curvature in the culmen-tarsus relationship. Is the relationship really linear? Squared? Cubic? Exponential? Use one of the cross-validation techniques we explored to show which model is more predictive. Justify your choice of technique. Do you get a clear answer? What does it say?

Both k-fold and LOO cross-validation indicate that the culmen-tarsus relationship is cubic. The answer was clear from the comparison tables: the cubic model sits at the top with an elpd_diff of 0.0 in both. Although we ran both cross-validation techniques, in hindsight k-fold alone would have sufficed because our data set is large. Leave-one-out cross-validation is best suited to small data sets, since it withholds only a single observation at a time and therefore keeps every training set as large as possible; with a data set this size, that advantage is not something we need to worry about.

KFOLD

               elpd_diff se_diff
    morph_cub        0.0     0.0
    morph_four      -2.6     1.3
    morph_lm       -12.9     5.1
    morph_sq       -14.2     5.5
    morph_int     -638.5    23.7

LOO

               elpd_diff se_diff
    morph_cub        0.0     0.0
    morph_four      -0.9     0.3
    morph_lm       -12.1     5.3
    morph_sq       -13.2     5.6
    morph_int     -636.5    23.7
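The tables above can be produced with the cross-validation helpers built into brms. A minimal sketch, assuming the `morph_*` models fit with `brm()` below already exist (the fold count `K = 10` is brms's default and a judgment call, not something the original specifies):

```r
# k-fold cross-validation: refits each model K times, so it is slow but robust
k_lm   <- kfold(morph_lm,   K = 10)
k_sq   <- kfold(morph_sq,   K = 10)
k_cub  <- kfold(morph_cub,  K = 10)
k_four <- kfold(morph_four, K = 10)
k_int  <- kfold(morph_int,  K = 10)
loo_compare(k_lm, k_sq, k_cub, k_four, k_int)

# approximate leave-one-out via PSIS: no refitting needed
loo_compare(loo(morph_lm), loo(morph_sq), loo(morph_cub),
            loo(morph_four), loo(morph_int))
```

`loo_compare()` reports each model's elpd relative to the best one, which is why the winning model shows an elpd_diff of exactly 0.0.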

# fit the candidate models to compare: intercept-only, linear, and higher-order polynomials ####
morph_lm <- brm(culmen_mm~tarsus_mm, 
                data=morphology,
                family= gaussian(link="identity"))
## Compiling Stan program...
## recompiling to avoid crashing R session
## Trying to compile a simple C file
## Running /Library/Frameworks/R.framework/Resources/bin/R CMD SHLIB foo.c
## clang -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG   -I"/Library/Frameworks/R.framework/Versions/4.0/Resources/library/Rcpp/include/"  -I"/Library/Frameworks/R.framework/Versions/4.0/Resources/library/RcppEigen/include/"  -I"/Library/Frameworks/R.framework/Versions/4.0/Resources/library/RcppEigen/include/unsupported"  -I"/Library/Frameworks/R.framework/Versions/4.0/Resources/library/BH/include" -I"/Library/Frameworks/R.framework/Versions/4.0/Resources/library/StanHeaders/include/src/"  -I"/Library/Frameworks/R.framework/Versions/4.0/Resources/library/StanHeaders/include/"  -I"/Library/Frameworks/R.framework/Versions/4.0/Resources/library/RcppParallel/include/"  -I"/Library/Frameworks/R.framework/Versions/4.0/Resources/library/rstan/include" -DEIGEN_NO_DEBUG  -DBOOST_DISABLE_ASSERTS  -DBOOST_PENDING_INTEGER_LOG2_HPP  -DSTAN_THREADS  -DBOOST_NO_AUTO_PTR  -include '/Library/Frameworks/R.framework/Versions/4.0/Resources/library/StanHeaders/include/stan/math/prim/mat/fun/Eigen.hpp'  -D_REENTRANT -DRCPP_PARALLEL_USE_TBB=1   -I/usr/local/include   -fPIC  -Wall -g -O2  -c foo.c -o foo.o
## In file included from <built-in>:1:
## In file included from /Library/Frameworks/R.framework/Versions/4.0/Resources/library/StanHeaders/include/stan/math/prim/mat/fun/Eigen.hpp:13:
## In file included from /Library/Frameworks/R.framework/Versions/4.0/Resources/library/RcppEigen/include/Eigen/Dense:1:
## In file included from /Library/Frameworks/R.framework/Versions/4.0/Resources/library/RcppEigen/include/Eigen/Core:88:
## /Library/Frameworks/R.framework/Versions/4.0/Resources/library/RcppEigen/include/Eigen/src/Core/util/Macros.h:613:1: error: unknown type name 'namespace'
## namespace Eigen {
## ^
## /Library/Frameworks/R.framework/Versions/4.0/Resources/library/RcppEigen/include/Eigen/src/Core/util/Macros.h:613:16: error: expected ';' after top level declarator
## namespace Eigen {
##                ^
##                ;
## In file included from <built-in>:1:
## In file included from /Library/Frameworks/R.framework/Versions/4.0/Resources/library/StanHeaders/include/stan/math/prim/mat/fun/Eigen.hpp:13:
## In file included from /Library/Frameworks/R.framework/Versions/4.0/Resources/library/RcppEigen/include/Eigen/Dense:1:
## /Library/Frameworks/R.framework/Versions/4.0/Resources/library/RcppEigen/include/Eigen/Core:96:10: fatal error: 'complex' file not found
## #include <complex>
##          ^~~~~~~~~
## 3 errors generated.
## make: *** [foo.o] Error 1
## Start sampling
## 
## SAMPLING FOR MODEL 'e754aa52dd0feb2be6d67820a7a0fe08' NOW (CHAIN 1).
## Chain 1: 
## Chain 1: Gradient evaluation took 2.1e-05 seconds
## Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.21 seconds.
## Chain 1: Adjust your expectations accordingly!
## Chain 1: 
## Chain 1: 
## Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
## Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
## Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
## Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
## Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
## Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
## Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
## Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
## Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
## Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
## Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
## Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
## Chain 1: 
## Chain 1:  Elapsed Time: 0.025406 seconds (Warm-up)
## Chain 1:                0.023558 seconds (Sampling)
## Chain 1:                0.048964 seconds (Total)
## Chain 1: 
## 
## (Chains 2-4 produced the same progress output; each completed 2000 iterations in ~0.05 s.)
morph_int <-  brm(culmen_mm~1, 
                  data=morphology,
                family= gaussian(link="identity")) 
## Compiling Stan program...
## (same C toolchain diagnostics as shown for the first model above)
## Start sampling
## (sampling progress omitted; 4 chains, 2000 iterations each, ~0.11 s per chain)
morph_sq <- brm(culmen_mm~poly(tarsus_mm, 2), 
                data=morphology,
                family= gaussian(link="identity"))
## Compiling Stan program...
## (same C toolchain diagnostics as shown for the first model above)
## Start sampling
## (sampling progress omitted; 4 chains, 2000 iterations each, ~0.08 s per chain)
morph_cub <- brm(culmen_mm~poly(tarsus_mm, 3), 
                 data=morphology,
                family= gaussian(link="identity"))
## Compiling Stan program...
## (same C toolchain diagnostics as shown for the first model above)
## Start sampling
## (sampling progress omitted; 4 chains, 2000 iterations each, ~0.10-0.12 s per chain)
morph_four <- brm(culmen_mm~poly(tarsus_mm, 4), 
                  data=morphology,
                family= gaussian(link="identity"))
## Compiling Stan program...
## (same C toolchain diagnostics as shown for the first model above)
## Start sampling
## 
## SAMPLING FOR MODEL 'e754aa52dd0feb2be6d67820a7a0fe08' NOW (CHAIN 1).
## Chain 1: 
## Chain 1: Gradient evaluation took 2.6e-05 seconds
## Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.26 seconds.
## Chain 1: Adjust your expectations accordingly!
## Chain 1: 
## Chain 1: 
## Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
## Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
## Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
## Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
## Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
## Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
## Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
## Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
## Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
## Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
## Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
## Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
## Chain 1: 
## Chain 1:  Elapsed Time: 0.074392 seconds (Warm-up)
## Chain 1:                0.034481 seconds (Sampling)
## Chain 1:                0.108873 seconds (Total)
## Chain 1: 
## (chains 2-4 produced analogous output; each finished in roughly 0.12 seconds total)
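The rstan startup message (quoted where the library loaded) recommends parallelizing chains and caching compiled Stan programs. Setting those options before the `brm()` calls, plus a fixed seed and `refresh = 0`, would quiet the per-chain progress log above and make the fits reproducible. A sketch (the seed value is arbitrary):

```r
# Run the four chains in parallel and cache compiled Stan programs,
# as the rstan startup message recommends
options(mc.cores = parallel::detectCores())
rstan_options(auto_write = TRUE)

# refresh = 0 suppresses the per-iteration progress log;
# seed makes the sampling reproducible
morph_four <- brm(culmen_mm ~ poly(tarsus_mm, 4),
                  data = morphology,
                  family = gaussian(link = "identity"),
                  seed = 607, refresh = 0)
```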
#KFOLD####
# 10-fold cross-validation for each polynomial model
morph_k <- kfold(morph_lm, k = 10) 
## Fitting model 1 out of 10
## Fitting model 2 out of 10
## Fitting model 3 out of 10
## Fitting model 4 out of 10
## Fitting model 5 out of 10
## Fitting model 6 out of 10
## Fitting model 7 out of 10
## Fitting model 8 out of 10
## Fitting model 9 out of 10
## Fitting model 10 out of 10
## Start sampling
## Warning: UNRELIABLE VALUE: Future ('<none>') unexpectedly generated random
## numbers without specifying argument '[future.]seed'. There is a risk that those
## random numbers are not statistically sound and the overall results might be
## invalid. To fix this, specify argument '[future.]seed', e.g. 'seed=TRUE'. This
## ensures that proper, parallel-safe random numbers are produced via the L'Ecuyer-
## CMRG method. To disable this check, use [future].seed=NULL, or set option
## 'future.rng.onMisuse' to "ignore".
## (the same "UNRELIABLE VALUE" warning was emitted for each of the remaining folds)
morph_k
## 
## Based on 10-fold cross-validation
## 
##            Estimate   SE
## elpd_kfold  -1252.8 21.8
## p_kfold         3.2  0.6
## kfoldic      2505.7 43.7
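As a quick sanity check on the table above, `kfoldic` is just `elpd_kfold` rescaled to the deviance scale:

```r
# kfoldic = -2 * elpd_kfold
-2 * -1252.8
## [1] 2505.6   (matches kfoldic up to rounding)
```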
morph_k_int <- kfold(morph_int, k = 10) 
## Fitting model 1 out of 10
## Fitting model 2 out of 10
## Fitting model 3 out of 10
## Fitting model 4 out of 10
## Fitting model 5 out of 10
## Fitting model 6 out of 10
## Fitting model 7 out of 10
## Fitting model 8 out of 10
## Fitting model 9 out of 10
## Fitting model 10 out of 10
## Start sampling
## Warning: UNRELIABLE VALUE: Future ('<none>') unexpectedly generated random
## numbers without specifying argument '[future.]seed'. There is a risk that those
## random numbers are not statistically sound and the overall results might be
## invalid. To fix this, specify argument '[future.]seed', e.g. 'seed=TRUE'. This
## ensures that proper, parallel-safe random numbers are produced via the L'Ecuyer-
## CMRG method. To disable this check, use [future].seed=NULL, or set option
## 'future.rng.onMisuse' to "ignore".
## (the same "UNRELIABLE VALUE" warning was emitted for each of the remaining folds)
morph_k_int
## 
## Based on 10-fold cross-validation
## 
##            Estimate   SE
## elpd_kfold  -1877.0 13.4
## p_kfold         1.4  0.4
## kfoldic      3754.0 26.9
morph_k_sq <- kfold(morph_sq, k = 10) 
## Fitting model 1 out of 10
## Fitting model 2 out of 10
## Fitting model 3 out of 10
## Fitting model 4 out of 10
## Fitting model 5 out of 10
## Fitting model 6 out of 10
## Fitting model 7 out of 10
## Fitting model 8 out of 10
## Fitting model 9 out of 10
## Fitting model 10 out of 10
## Start sampling
## Warning: UNRELIABLE VALUE: Future ('<none>') unexpectedly generated random
## numbers without specifying argument '[future.]seed'. There is a risk that those
## random numbers are not statistically sound and the overall results might be
## invalid. To fix this, specify argument '[future.]seed', e.g. 'seed=TRUE'. This
## ensures that proper, parallel-safe random numbers are produced via the L'Ecuyer-
## CMRG method. To disable this check, use [future].seed=NULL, or set option
## 'future.rng.onMisuse' to "ignore".
## (the same "UNRELIABLE VALUE" warning was emitted for each of the remaining folds)
morph_k_sq
## 
## Based on 10-fold cross-validation
## 
##            Estimate   SE
## elpd_kfold  -1255.6 22.0
## p_kfold         6.2  1.0
## kfoldic      2511.2 44.1
morph_k_cub <- kfold(morph_cub, k = 10) 
## Fitting model 1 out of 10
## Fitting model 2 out of 10
## Fitting model 3 out of 10
## Fitting model 4 out of 10
## Fitting model 5 out of 10
## Fitting model 6 out of 10
## Fitting model 7 out of 10
## Fitting model 8 out of 10
## Fitting model 9 out of 10
## Fitting model 10 out of 10
## Start sampling
## Warning: UNRELIABLE VALUE: Future ('<none>') unexpectedly generated random
## numbers without specifying argument '[future.]seed'. There is a risk that those
## random numbers are not statistically sound and the overall results might be
## invalid. To fix this, specify argument '[future.]seed', e.g. 'seed=TRUE'. This
## ensures that proper, parallel-safe random numbers are produced via the L'Ecuyer-
## CMRG method. To disable this check, use [future].seed=NULL, or set option
## 'future.rng.onMisuse' to "ignore".
## (the same "UNRELIABLE VALUE" warning was emitted for each of the remaining folds)
morph_k_cub
## 
## Based on 10-fold cross-validation
## 
##            Estimate   SE
## elpd_kfold  -1240.6 22.0
## p_kfold         4.5  0.7
## kfoldic      2481.2 44.0
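Rather than eyeballing the elpd values one table at a time, the kfold objects can be passed to `loo_compare()`. One caveat: each `kfold()` call above drew its own random folds, so a strict comparison would ideally reuse the same folds (e.g. by setting a seed before each call). A sketch using the objects already fit:

```r
# Higher elpd_kfold (first row) indicates better expected
# out-of-sample predictive density
loo_compare(morph_k, morph_k_int, morph_k_sq, morph_k_cub)
```

Given the estimates printed above, the cubic model should come out on top (elpd_kfold = -1240.6), with the intercept-only model far behind.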
morph_k_four <- kfold(morph_four, k = 10) 
## Fitting model 1 out of 10
## Fitting model 2 out of 10
## Fitting model 3 out of 10
## Fitting model 4 out of 10
## Fitting model 5 out of 10
## Fitting model 6 out of 10
## Fitting model 7 out of 10
## Fitting model 8 out of 10
## Fitting model 9 out of 10
## Fitting model 10 out of 10
## Start sampling
## Warning: UNRELIABLE VALUE: Future ('<none>') unexpectedly generated random
## numbers without specifying argument '[future.]seed'. There is a risk that those
## random numbers are not statistically sound and the overall results might be
## invalid. To fix this, specify argument '[future.]seed', e.g. 'seed=TRUE'. This
## ensures that proper, parallel-safe random numbers are produced via the L'Ecuyer-
## CMRG method. To disable this check, use [future].seed=NULL, or set option
## 'future.rng.onMisuse' to "ignore".
## Start sampling
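The warning above comes from the `future` backend that `kfold()` uses for parallel sampling, and it repeats once per fold. Following the warning text itself, one hedged way to handle it before re-running the k-fold calls (exact behavior may vary by `future`/`brms` version):

```r
# Parallel-safe RNG setup for the future backend used during k-fold CV.
# Per the warning text: either ensure a parallel-safe seed is supplied,
# or silence the check via the option it names.
library(future)
plan(multisession)
options(future.rng.onMisuse = "ignore")  # suppresses the check, as the warning suggests
```

Note this only silences the diagnostic; for fully reproducible folds you would still want to supply a seed where the sampling function accepts one.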
morph_k_four
## 
## Based on 10-fold cross-validation
## 
##            Estimate   SE
## elpd_kfold  -1243.0 22.0
## p_kfold         6.8  0.9
## kfoldic      2486.1 44.0
#compare kfolds
loo_compare(morph_k_four,morph_k_cub, morph_k_sq, morph_k_int, morph_k)
##            elpd_diff se_diff
## morph_cub     0.0       0.0 
## morph_four   -2.4       1.0 
## morph_lm    -12.3       5.2 
## morph_sq    -15.0       5.7 
## morph_int  -636.4      23.8
#LOO####
#loo of our polynomial models
morph_loo <- loo(morph_lm)
morph_loo
## 
## Computed from 4000 by 766 log-likelihood matrix
## 
##          Estimate   SE
## elpd_loo  -1252.8 21.9
## p_loo         3.1  0.3
## looic      2505.6 43.7
## ------
## Monte Carlo SE of elpd_loo is 0.0.
## 
## All Pareto k estimates are good (k < 0.5).
## See help('pareto-k-diagnostic') for details.
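The output reports that all Pareto k estimates are below 0.5, meaning the importance-sampling approximation is reliable for every observation. A hedged way to inspect those diagnostics directly, using helpers from the `loo` package:

```r
# Inspect per-observation Pareto-k diagnostics for the LOO object.
# pareto_k_table() bins the k values (good / ok / bad / very bad);
# pareto_k_values() returns the raw k estimate for each observation.
pareto_k_table(morph_loo)
head(pareto_k_values(morph_loo))
```

Any observation with k > 0.7 would flag an unreliable LOO estimate for that point; here none do.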
morph_loo_int <- loo(morph_int)
morph_loo_int
## 
## Computed from 4000 by 766 log-likelihood matrix
## 
##          Estimate   SE
## elpd_loo  -1877.1 13.5
## p_loo         1.5  0.1
## looic      3754.1 26.9
## ------
## Monte Carlo SE of elpd_loo is 0.0.
## 
## All Pareto k estimates are good (k < 0.5).
## See help('pareto-k-diagnostic') for details.
morph_loo_sq <- loo(morph_sq)
morph_loo_sq
## 
## Computed from 4000 by 766 log-likelihood matrix
## 
##          Estimate   SE
## elpd_loo  -1254.0 21.9
## p_loo         4.6  0.7
## looic      2508.0 43.9
## ------
## Monte Carlo SE of elpd_loo is 0.0.
## 
## All Pareto k estimates are good (k < 0.5).
## See help('pareto-k-diagnostic') for details.
morph_loo_cub <- loo(morph_cub)
morph_loo_cub
## 
## Computed from 4000 by 766 log-likelihood matrix
## 
##          Estimate   SE
## elpd_loo  -1240.5 22.1
## p_loo         4.4  0.4
## looic      2481.0 44.1
## ------
## Monte Carlo SE of elpd_loo is 0.0.
## 
## All Pareto k estimates are good (k < 0.5).
## See help('pareto-k-diagnostic') for details.
morph_loo_four <- loo(morph_four)
morph_loo_four
## 
## Computed from 4000 by 766 log-likelihood matrix
## 
##          Estimate   SE
## elpd_loo  -1241.4 22.0
## p_loo         5.2  0.5
## looic      2482.8 44.0
## ------
## Monte Carlo SE of elpd_loo is 0.0.
## 
## All Pareto k estimates are good (k < 0.5).
## See help('pareto-k-diagnostic') for details.
#compare loos
loo_compare(morph_loo, morph_loo_int,morph_loo_sq,morph_loo_cub,morph_loo_four)
##            elpd_diff se_diff
## morph_cub     0.0       0.0 
## morph_four   -0.9       0.3 
## morph_lm    -12.3       5.3 
## morph_sq    -13.5       5.6 
## morph_int  -636.5      23.8
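A common rule of thumb (a heuristic, not a formal test) is that two models are hard to distinguish unless |elpd_diff| exceeds roughly twice se_diff. That check can be applied to the comparison table above:

```r
# Heuristic screen on the loo_compare() table: which models differ from
# the best (cubic) model by more than ~2 standard errors of the difference?
cmp <- loo_compare(morph_loo, morph_loo_int, morph_loo_sq,
                   morph_loo_cub, morph_loo_four)
abs(cmp[, "elpd_diff"]) > 2 * cmp[, "se_diff"]
```

By this standard, the linear, squared, and intercept-only models are clearly worse than the cubic fit, while the cubic and quartic models are close enough that the simpler cubic is the reasonable choice.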